Invoke - Local LLM Client 4+

Chat with Local LLM

kazuhiko sugimoto

Designed for iPad

    • 4.0 • 1 Rating
    • Free

Description

Note: This app is intended for users who are able to set up a local LLM server (Ollama or LM Studio) within their own LAN environment. Some technical setup is required.

Chat with your local LLM! Seamlessly connect to Ollama or LM Studio for a fully offline, privacy-focused AI chat experience!

This iOS app connects to a locally hosted Large Language Model (LLM) server and enables seamless, natural conversations.
Compatible with Ollama and LM Studio via HTTP, it provides real-time message streaming and intuitive chat history management.
The app operates entirely within a local network—no internet connection required—making it ideal for those who prioritize privacy and security.
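
As a rough illustration of what that HTTP exchange can look like, the sketch below (not the app's own code) sends a streaming chat request to an Ollama server on the LAN and decodes the newline-delimited JSON chunks as they arrive. The host address, port, and model name are placeholder assumptions; an LM Studio server would instead be reached through its OpenAI-compatible endpoint (typically /v1/chat/completions on port 1234), which streams in a slightly different format.

import Foundation

// Minimal sketch of a streaming chat request against Ollama's /api/chat
// endpoint. The LAN address and model name below are assumptions.
struct ChatMessage: Codable {
    let role: String       // "system", "user", or "assistant"
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
    let stream: Bool
}

// Each line of the streamed response body is one JSON object holding a
// partial assistant message and a "done" flag.
struct ChatChunk: Codable {
    let message: ChatMessage?
    let done: Bool
}

func streamChat(prompt: String) async throws {
    let url = URL(string: "http://192.168.1.10:11434/api/chat")!   // hypothetical server address

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(
        ChatRequest(model: "llama3",                               // assumed model name
                    messages: [ChatMessage(role: "user", content: prompt)],
                    stream: true)
    )

    // URLSession's async bytes API delivers the body incrementally, so each
    // newline-delimited chunk can be decoded and shown as soon as it arrives.
    let (bytes, _) = try await URLSession.shared.bytes(for: request)
    for try await line in bytes.lines {
        guard let data = line.data(using: .utf8) else { continue }
        let chunk = try JSONDecoder().decode(ChatChunk.self, from: data)
        if let text = chunk.message?.content {
            print(text, terminator: "")                            // e.g. append to the chat bubble
        }
        if chunk.done { break }
    }
}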

Key Features:
- Easy connection to local LLM servers (Ollama / LM Studio)
- Natural chat UI with bubble-style layout
- Auto-saving and browsing chat history
- Server and model selection via settings screen (see the sketch after this list)
- Supports Dark Mode
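
The model selection mentioned above usually works by asking the server which models it has installed. The snippet below is an illustrative sketch of that call against Ollama's /api/tags endpoint; the LAN address is hypothetical, and LM Studio instead exposes an OpenAI-compatible GET /v1/models listing.

import Foundation

// Sketch of fetching installed model names from Ollama, the sort of call a
// settings screen could use to populate a model picker.
struct TagsResponse: Codable {
    struct Model: Codable { let name: String }
    let models: [Model]
}

func fetchModelNames() async throws -> [String] {
    let url = URL(string: "http://192.168.1.10:11434/api/tags")!   // hypothetical server address
    let (data, _) = try await URLSession.shared.data(from: url)
    return try JSONDecoder().decode(TagsResponse.self, from: data).models.map(\.name)
}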

What’s New

Version 1.1.1

- Performance improvements
- Usability improvements
- Bug fixes

Ratings and Reviews

4.0 out of 5
1 Rating

App Privacy

The developer, kazuhiko sugimoto, indicated that the app’s privacy practices may include handling of data as described below. For more information, see the developer’s privacy policy.

Data Not Collected

The developer does not collect any data from this app.

Privacy practices may vary, for example, based on the features you use or your age.

More By This Developer

- Japanese News Player (News)
- Photo Calc - Simple Calculator (Productivity)
- Tech Info Player (News)
- Feedable - Simple RSS Reader (News)