MLC Chat 17+

Chat with Open Language Models

Tianqi Chen

Designed for iPad

    • 4.2 • 58 Ratings
    • Free

Description

Running large language models locally on your phone

MLC Chat lets users chat with open language models locally on iPads and iPhones. After a model is downloaded to the app, everything runs locally without server support; it works without an internet connection and does not record any information.

Because the models run locally, the app only works on devices with sufficient VRAM for the model being used.

MLC Chat is part of the open-source project MLC LLM, which allows any language model to be deployed natively on a diverse set of hardware backends and native applications. MLC Chat is a runtime that runs different open model architectures on your phone. The app is intended for non-commercial purposes. It lets you run open language models downloaded from the internet; each model is subject to its respective license.
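For context, the underlying MLC LLM project also ships a Python API for running the same compiled models outside the app. The snippet below is a minimal sketch of that workflow; the MLCEngine class, the OpenAI-style chat.completions.create call, and the example model identifier are assumptions drawn from the MLC LLM documentation and are not part of the iOS app itself.

    # Minimal sketch: chatting with an MLC-compiled model through the MLC LLM
    # Python API. The import path, MLCEngine class, and model identifier are
    # assumptions based on the MLC LLM docs, not part of the MLC Chat iOS app.
    from mlc_llm import MLCEngine

    # Hypothetical model identifier; any MLC-compiled weights should work here.
    model = "HF://mlc-ai/Mistral-7B-Instruct-v0.2-q4f16_1-MLC"
    engine = MLCEngine(model)

    # OpenAI-style streaming chat completion, printed token by token.
    for response in engine.chat.completions.create(
        messages=[{"role": "user", "content": "What does running an LLM locally mean?"}],
        model=model,
        stream=True,
    ):
        for choice in response.choices:
            print(choice.delta.content or "", end="", flush=True)
    print()

    engine.terminate()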

What’s New

Version 1.2

This version adds Mistral support.

Ratings and Reviews

4.2 out of 5
58 Ratings

DarkCougar

Neat to use an LLM on my phone, but…

I was disappointed to see that I could not save a conversation other than by copying one response at a time.

When I put the app in the background and then returned, the conversation closed and I was sent back to the main screen. When I went back to the conversation screen, I found the app had been reset and my earlier data was gone.

Still, I’m impressed to have the chance to play with models on my phone.

Chaosmouse

Offline GPT alternative

Definitely not as good as the GPT-4 models, but it runs offline locally, and it’s free. Pretty amazing.

*Feature request* It would be really nice if I could delete models that I’ve added. The Hugging Face site for MLC AI doesn’t explain the VRAM requirements per model, so my current workaround is to delete the app and all its data, then reinstall it every time I try a new model. So far it looks like only the preinstalled model is compatible with the iPad mini 6 and iPhone 13 mini based on the requested VRAM, but reinstalling a 1.6 GB app every time seems a little unnecessary. It would also be nice if we could change settings per model, such as repetition penalty and temperature. Finally, a how-to for importing any other LLM (or recompiling it for use if necessary) would be really awesome.

Lisa Macintosh

Please add the ability to edit responses and the system prompt

This app is barebones, but the newly added Mistral model is amazing for something running locally on an iPhone.

It would be nice, though, to be able to edit the system prompt and the responses.

App Privacy

The developer, Tianqi Chen, indicated that the app’s privacy practices may include handling of data as described below. For more information, see the developer’s privacy policy.

Data Not Collected

The developer does not collect any data from this app.

Privacy practices may vary, for example, based on the features you use or your age.

You Might Also Like

    • YourChat (Productivity)
    • HuggingChat (Productivity)
    • Private LLM - Local AI Chatbot (Productivity)
    • OpenCat (Productivity)
    • Rewind: Truly Personalized AI (Productivity)
    • Pal Chat - AI Chat Client (Productivity)