Viam App interface showing a new machine instance named smart-assistant, demonstrating the process of setting up a local LLM for machine learning in robotics.

Getting Started with Local LLM Machine Learning for Robotics Using Viam

Integrating large language models (LLMs) into robotics opens up exciting possibilities for creating more intelligent and interactive robots. Running an LLM locally on your robot enhances privacy, reduces latency, and allows operation in environments with limited or no internet connectivity. This guide will walk you through setting up a local LLM on a single-board computer (SBC) for robotics applications using Viam, a hardware-agnostic robotics platform. This approach empowers you to explore LLM machine learning directly on your robot, making it smarter and more responsive.

To get started, Viam’s platform lets you use a wide range of hardware. While a MacBook can be used for initial exploration, for a truly portable and embedded robotics experience we recommend a single-board computer such as the Orange Pi 5B (or any octa-core SBC for optimal performance) or a Raspberry Pi 5.

The first step is to install viam-server on your chosen SBC. Follow the comprehensive Viam installation instructions for detailed guidance. Once installed, navigate to the Viam App and create a new machine instance. For this guide, we’ll name our instance “smart-assistant”, but feel free to choose a name that suits your project.

Now, let’s configure the necessary components and services for our robot, starting with a camera. Viam offers broad compatibility with different cameras. For simplicity and ease of setup, we’ll use a webcam. Webcams are generally plug-and-play with Viam and can be configured within the Viam app in just a few clicks, as Viam automatically detects any connected cameras.
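A webcam component in the Viam app configuration typically looks like the JSON sketch below. This is illustrative only: the component name `cam` is a placeholder, and the `video_path` attribute can usually be left for Viam to auto-detect when the camera is discovered automatically.

```json
{
  "name": "cam",
  "namespace": "rdk",
  "type": "camera",
  "model": "webcam",
  "attributes": {
    "video_path": "video0"
  }
}
```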

With the camera set up, we can now integrate the core of our project: the local LLM. In the Viam App, access the Service configuration menu and select ‘chat’ to add the local LLM service. This is where the magic of LLM machine learning comes to life on your robot.

Detailed configuration attributes are available in the local LLM module README. For a quick start, the default settings are usually sufficient.
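For orientation, a modular chat service entry in the machine configuration generally follows Viam’s standard service shape, sketched below. The `namespace`, `type`, and `model` strings here are placeholders for illustration; use the exact values given in the local LLM module README, and leave `attributes` empty to accept the defaults.

```json
{
  "name": "llm",
  "namespace": "viam-labs",
  "type": "chat",
  "model": "viam-labs:chat:llm",
  "attributes": {}
}
```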

After configuring the LLM module, let’s verify its functionality. We’ll use a simple Python script and the Viam Python SDK for this test. If you haven’t already, install the Viam Python SDK by following the installation guide. Then, install the chat API required by the local LLM module by running the following command in your terminal:

pip install chat_service_api@git+https://github.com/viam-labs/chat-service-api.git@main

Next, create a Python script on your machine, replacing the api_key, api_key_id, and robot address placeholders with the values obtained from the Code sample tab in the Viam app.

import asyncio
from viam.robot.client import RobotClient
from chat_service_api import Chat

async def connect():
    opts = RobotClient.Options.with_api_key(
        api_key='<your_api_key>',
        api_key_id='<your_api_key_id>'
    )
    return await RobotClient.at_address('<your_robot_address>', opts)

async def main():
    robot = await connect()
    llm = Chat.from_robot(robot, name="llm")
    response = await llm.chat("what is a robot?")
    print(response)
    # Don't forget to close the machine when you're done!
    await robot.close()

if __name__ == '__main__':
    asyncio.run(main())

Executing this script should yield an output similar to this:

“A robot is an autonomous machine designed to perform specific tasks with limited human control. They are programmed to follow predetermined instructions or rules, making them highly efficient and reliable. Robots are used in various industries, from manufacturing and assembly to healthcare and transportation.”
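From here you can build on the single `llm.chat()` call, for example by sending a series of prompts. The sketch below factors the loop into a function that takes any async chat callable, so it can be exercised with a stand-in before wiring it to the real service; `fake_chat` is a hypothetical stub, not part of the Viam or chat-service APIs. On the robot you would pass `llm.chat` instead.

```python
import asyncio

async def chat_loop(ask, prompts):
    """Send each prompt to an async chat callable and collect the responses."""
    responses = []
    for prompt in prompts:
        responses.append(await ask(prompt))
    return responses

# Stand-in for llm.chat so the loop can be tried without a robot connection:
async def fake_chat(prompt):
    return f"echo: {prompt}"

answers = asyncio.run(chat_loop(fake_chat, ["what is a robot?", "name one use"]))
print(answers)  # → ['echo: what is a robot?', 'echo: name one use']
```

On a live machine, replace `fake_chat` with the `llm.chat` method from the connected service inside `main()`.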

Congratulations! You have successfully set up a local LLM, leveraging LLM machine learning, that runs directly on your machine, enabling your robot to generate responses and understand commands locally. This is a significant step towards creating more intelligent and privacy-conscious robotic systems. By using Viam, you’ve simplified the integration of complex LLM machine learning models into your robotics projects, opening the door to a world of innovative applications and explorations in edge AI and autonomous systems.
