
Invoke - Local LLM Client
Chat with Local LLM
Free · Designed for iPad
Note: This app is intended for users who are able to set up a local LLM server (Ollama or LM Studio) within their own LAN environment. Some technical setup is required.
Chat with your local LLM! Seamlessly connect to Ollama or LM Studio for a fully offline, privacy-focused AI chat experience!
This iOS app connects to a locally hosted Large Language Model (LLM) server and enables seamless, natural conversations.
Compatible with Ollama and LM Studio via HTTP, it provides real-time message streaming and intuitive chat history management.
The app operates entirely within a local network—no internet connection required—making it ideal for those who prioritize privacy and security.
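For illustration, the following is a minimal Swift sketch of the kind of HTTP streaming exchange described above, assuming an Ollama server reachable on the LAN at 192.168.1.10:11434 and a model named "llama3.2" (both hypothetical placeholders); LM Studio instead exposes an OpenAI-compatible endpoint, typically http://<host>:1234/v1/chat/completions. This is not the app's own implementation, only a sketch of the protocol it relies on.

```swift
import Foundation

// Sketch: stream a chat reply from a local Ollama server on the LAN.
// Host, port, and model name are assumptions; adjust them to your setup.
func streamChat(prompt: String) async throws {
    let url = URL(string: "http://192.168.1.10:11434/api/chat")!  // hypothetical LAN address
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: [
        "model": "llama3.2",                                      // assumed model name
        "messages": [["role": "user", "content": prompt]],
        "stream": true
    ] as [String: Any])

    // Ollama streams newline-delimited JSON objects; print each token as it arrives.
    let (bytes, _) = try await URLSession.shared.bytes(for: request)
    for try await line in bytes.lines {
        guard let data = line.data(using: .utf8),
              let chunk = try JSONSerialization.jsonObject(with: data) as? [String: Any],
              let message = chunk["message"] as? [String: Any],
              let content = message["content"] as? String else { continue }
        print(content, terminator: "")
        if chunk["done"] as? Bool == true { break }
    }
}
```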
Key Features:
- Easy connection to local LLM servers (Ollama / LM Studio)
- Natural chat UI with bubble-style layout
- Auto-saving and browsing chat history
- Server and model selection via settings screen (see the sketch after this list)
- Supports Dark Mode
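As a rough sketch of how server and model selection could work, the snippet below queries an Ollama server's /api/tags endpoint for the models it hosts; the host address is an assumption, and LM Studio would instead list models at its OpenAI-compatible /v1/models endpoint.

```swift
import Foundation

// Sketch: discover available models on a local Ollama server so a settings
// screen can offer them for selection. The host address is an assumption.
struct OllamaTags: Decodable {
    struct Model: Decodable { let name: String }
    let models: [Model]
}

func fetchModelNames(host: String = "192.168.1.10") async throws -> [String] {
    let url = URL(string: "http://\(host):11434/api/tags")!
    let (data, _) = try await URLSession.shared.data(from: url)
    return try JSONDecoder().decode(OllamaTags.self, from: data).models.map(\.name)
}
```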
Ratings & Reviews
This app has not received enough ratings or reviews to display an overview.
What's New
- Performance improvements
- Usability improvements
- Bug fixes
The developer, kazuhiko sugimoto, indicated that the app’s privacy practices may include handling of data as described below. For more information, see the developer’s privacy policy.
Data Not Collected
The developer does not collect any data from this app.
Accessibility
The developer has not yet indicated which accessibility features this app supports.
Information
- Provider
- kazuhiko sugimoto
- Size
- 2.9 MB
- Category
- Developer Tools
- Compatibility
- iPhone: Requires iOS 16.6 or later.
- iPad: Requires iPadOS 16.6 or later.
- Mac: Requires macOS 13.5 or later and a Mac with Apple M1 chip or later.
- Apple Vision: Requires visionOS 1.0 or later.
- Languages
- English and Japanese
- Age Rating
- 4+
- Copyright
- © 2025 kas apps