Michael Plis

Apr 25 · 3 min read

How to build your own AI chatbot on your computer

Updated: 3 days ago

Will we teach our own AI models? Photo by Andrea De Santis on Unsplash

In today's AI landscape, the allure of generative AI extends beyond cloud services to local installations on personal computers. This blog delves into the benefits and practicalities of bringing this cutting-edge technology directly to your device, offering a glimpse into the future of AI accessibility and innovation. I'll show you how to build your own AI chatbot. Read on.

We'll explore the fundamentals of local generative AI, from understanding the underlying models to navigating installation processes with user-friendly tools like LM Studio. By demystifying the complexities and offering practical guidance, we aim to empower readers to embark on their own AI exploration journey. Whether you're a novice or an enthusiast, join us as we unlock the potential of generative AI, right at your fingertips.

Bringing AI Power to Your Device

Many are familiar with generative AI tools like ChatGPT and Google Bard, typically accessed through cloud services. However, there's a way to tap into this technology directly on your own computer.

Installing generative AI locally offers privacy benefits and eliminates concerns about capacity or availability issues. Plus, it's just plain cool to have that kind of power at your fingertips.

Understanding the Basics

To embark on this journey, you'll need both a program to run the AI and a Large Language Model (LLM) to generate responses. These LLMs serve as the backbone of text generation AI: GPT-4 drives ChatGPT, while Google's Gemini models power the chatbot of the same name.

While delving into the realm of LLMs may seem daunting, they essentially function as supercharged autocomplete engines, trained on vast amounts of data to recognize relationships between words and sentences.

Exploring Available Models

There's a variety of LLMs you can install locally, including those released by Meta (like Llama) and others developed by researchers and volunteers. Publicly available LLMs aim to foster innovation and transparency, making them accessible to a broader audience.

For this guide, we'll focus on LM Studio, a user-friendly option for installing LLMs on Windows, macOS, and Linux systems.

LM Studio Capabilities & System Requirements

With LM Studio, you can ...

🤖 - Run LLMs on your laptop, entirely offline

👾 - Use models through the in-app Chat UI or an OpenAI compatible local server

📂 - Download any compatible model files from HuggingFace 🤗 repositories

🔭 - Discover new & noteworthy LLMs in the app's home page

LM Studio supports any ggml Llama, MPT, and StarCoder model on Hugging Face (Llama 2, Orca, Vicuna, Nous Hermes, WizardCoder, MPT, etc.)
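The "OpenAI compatible local server" capability is worth pausing on: once you enable the server in LM Studio, it listens on http://localhost:1234/v1 by default, and any OpenAI-style client can talk to your local model. Here's a minimal sketch using only Python's standard library; the model name and system prompt are placeholder assumptions, and the exact model identifier depends on what you've downloaded:

```python
import json
import urllib.request

def build_chat_payload(prompt, model="local-model", temperature=0.7):
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
        "temperature": temperature,
    }

def ask_local_llm(prompt, url="http://localhost:1234/v1/chat/completions"):
    """Send the prompt to LM Studio's local server and return the reply text."""
    data = json.dumps(build_chat_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

With the server running and a model loaded, calling `ask_local_llm("In one sentence, what is an LLM?")` returns the model's reply, entirely offline.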

Minimum requirements:

M1/M2/M3 Mac, or a Windows PC with a processor that supports AVX2. Linux is available in beta.

Build your own AI chatbot with LM Studio

Getting started with LM Studio involves downloading the software from the official website and ensuring your system meets the minimum requirements, such as sufficient RAM and VRAM.
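A useful rule of thumb when checking whether you have "sufficient RAM and VRAM": a quantized model's weight file takes roughly its parameter count times the bits per weight, divided by 8, in bytes. This back-of-envelope sketch is my own simplification and ignores extra memory for context and runtime overhead:

```python
def model_size_gb(params_billion, bits_per_weight=4):
    """Rough weight-file size: parameters x bits per weight, in GB."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 7-billion-parameter model at 4-bit quantization needs roughly
# 3.5 GB just for the weights, so a machine with 8 GB of RAM is a
# comfortable starting point for models of this size.
size = model_size_gb(7, bits_per_weight=4)
```
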

Once installed, you can explore and download LLMs within the application.

LM Studio simplifies the process by recommending notable LLMs and providing options to filter and manage installed models.

With LM Studio, you can engage in prompt-based interactions with the selected LLM, controlling various settings to tailor the AI's responses to your preferences.
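One of the most useful of those settings is temperature, which controls how adventurous the model's word choices are. Conceptually, the model's scores for candidate next words are divided by the temperature before being turned into probabilities, so a low temperature sharpens the distribution toward the most likely word, while a high one flattens it. A small illustration with made-up scores:

```python
import math

def softmax_with_temperature(scores, temperature=1.0):
    """Convert raw scores to probabilities, scaled by temperature."""
    scaled = [s / temperature for s in scores]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

scores = [2.0, 1.0, 0.5]  # hypothetical scores for three candidate words
cold = softmax_with_temperature(scores, temperature=0.2)
hot = softmax_with_temperature(scores, temperature=2.0)
# cold concentrates nearly all probability on the top word;
# hot spreads it more evenly across all three.
```

This is why low temperatures feel repetitive but reliable, and high temperatures feel creative but erratic.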

Embarking on AI Exploration

With local LLMs up and running, the possibilities for AI-driven interactions are vast. While delving deeper into LLM development may require additional learning, LM Studio streamlines the setup process, even for beginners.

Whether you're curious about AI technology or eager to experiment with text generation, harnessing generative AI locally offers a fascinating glimpse into the future of human-computer interaction.

Future of local AI models

Local AI models will keep growing in capability. They'll become lighter and easier to install, running on software that you can feed data as simply as adding documents to feed the beast.

I envision each person customising their own AI chatbot with the knowledge that they have gathered in their personal documents and personal beliefs.

For example, I would like to feed it the Bible, publications that matter to me, and my own files, and have a digital assistant that I can talk with, study with, and learn from, drawing on knowledge that matters to me but sometimes gets lost in files on the computer.

Remember: humans, animals, and our surroundings matter more than training AI models. Always find the time to go outside and smell the roses, as it were.

Happy AI learning!

Michael Plis

References

LM Studio

https://lmstudio.ai/

You Can Run a Generative AI Locally on Your Computer

https://lifehacker.com/tech/how-to-run-generative-ais-locally-on-your-computer
