LLM Farm 4+

Run LLM

Artem Savkin

Designed for iPad

    • Free

Description

Now with Shortcuts support.

LLMFarm is an iOS and macOS app for working with large language models (LLMs). It allows you to load different LLMs with configurable parameters.
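For developers curious how the Shortcuts support mentioned above could fit together, here is a minimal App Intents sketch of exposing a "send prompt, get completion" action to the Shortcuts app. The AskModelIntent type and the runInference helper are hypothetical names for illustration, not LLMFarm's actual code.

```swift
import AppIntents

// Hypothetical App Intent exposing a prompt action to Shortcuts (iOS 16+ / macOS 13+).
// AskModelIntent and runInference are illustrative names, not LLMFarm's actual API.
struct AskModelIntent: AppIntent {
    static var title: LocalizedStringResource = "Ask Model"

    @Parameter(title: "Prompt")
    var prompt: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // Stand-in for loading the selected model and generating a completion.
        let answer = try await runInference(prompt: prompt)
        return .result(value: answer)
    }
}

// Placeholder for the app's real inference call.
func runInference(prompt: String) async throws -> String {
    "(model output for: \(prompt))"
}
```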

# Features
* Various inferences
* Various sampling methods
* Metal
* Model setting templates
* LoRA adapters support
* LoRA FineTune and Export

# Inferences
* LLaMA
* GPTNeoX
* Replit
* GPT2 + Cerebras
* Starcoder (Santacoder)
* RWKV (20B tokenizer)
* Falcon
* MPT
* Bloom
* StableLM-3b-4e1t
* Qwen
* Gemma
* Phi
* Mamba
* Others

# Multimodal
* LLaVA 1.5 models
* Obsidian
* MobileVLM 1.7B/3B models

Note: For Falcon, Alpaca, GPT4All, Chinese LLaMA / Alpaca and Chinese LLaMA-2 / Alpaca-2, Vigogne (French), Vicuna, Koala, OpenBuddy (Multilingual), Pygmalion/Metharme, WizardLM, Baichuan 1 & 2 + derivatives, Aquila 1 & 2, Mistral AI v0.1, Refact, Persimmon 8B, MPT, and Bloom, select the LLaMA inference in model settings.

# Sampling methods
* Temperature (temp, top-k, top-p)
* Tail Free Sampling (TFS)
* Locally Typical Sampling
* Mirostat
* Greedy
* Grammar (doesn't work with GGJTv3)
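As a rough guide to how the first of these samplers combine, the sketch below shows temperature scaling followed by top-k and top-p (nucleus) filtering, matching the temp, top-k, and top-p parameters listed above. It is illustrative only: in the app the actual sampling is handled by llama.cpp, and the sampleToken function here is a hypothetical stand-in.

```swift
import Foundation

// Minimal sketch of temperature + top-k + top-p (nucleus) sampling over raw logits.
// Illustrative only; the app's real sampling is done by llama.cpp.
func sampleToken(logits: [Double],
                 temperature: Double = 0.8,
                 topK: Int = 40,
                 topP: Double = 0.95) -> Int {
    // Temperature scaling: lower values sharpen the distribution, higher values flatten it.
    let scaled = logits.map { $0 / max(temperature, 1e-6) }

    // Softmax, shifted by the max logit for numerical stability.
    let maxLogit = scaled.max() ?? 0
    let exps = scaled.map { exp($0 - maxLogit) }
    let sum = exps.reduce(0, +)
    let probs = exps.map { $0 / sum }

    // Top-k: keep only the k most probable tokens.
    var candidates = Array(probs.enumerated()
        .sorted { $0.element > $1.element }
        .prefix(topK))

    // Top-p: trim to the smallest prefix whose cumulative probability reaches topP.
    var cumulative = 0.0
    var cutoff = candidates.count
    for (index, candidate) in candidates.enumerated() {
        cumulative += candidate.element
        if cumulative >= topP { cutoff = index + 1; break }
    }
    candidates = Array(candidates.prefix(cutoff))

    // Renormalize over the surviving tokens and draw one at random.
    let total = candidates.reduce(0.0) { $0 + $1.element }
    var r = Double.random(in: 0..<total)
    for candidate in candidates {
        r -= candidate.element
        if r <= 0 { return candidate.offset }
    }
    return candidates.last!.offset
}
```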

What's New

Version 1.2.0

Changes:
* Added Shortcuts support
* llama.cpp updated to b2864
* Fixed some llama-3 and Command-R issues
* Added llama-3 instruct template
* Fixed a bug that could cause the application to crash if the system prompt format is incorrect
* Fixed a memory bug in the grammar parser
* Fixed some other bugs

Ratings and Reviews

4.5 out of 5
16 Ratings

Feeling defrauded

Looks promising

This app looks pretty promising, but it’s a little bit daunting to someone who’s not as familiar with setting up LLMs. For example, how do you download the LLMs and where do you go to get them? Which LLMs are likely to work? It might be a good idea to include specific LLMs that have been tested on which devices. Some sort of tutorial or instructions would be really useful.

Developer Response

Thanks for the feedback. I try to make the application more understandable and user-friendly as I refine it, but the main focus is on functionality.

takemusuaiki

Impressive start.

Love this app. Surprisingly powerful, with tons of tweaking options that many apps on full computers lack. I use it on my phone and iPad. I would love to see support for newer Gemma models and the ability to act as an inference server like Ollama or LM Studio, so I can run local inference for Obsidian or other apps.

Developer Response

Thank you for the feedback!

Jadeee13

Exactly what I’ve been waiting for!

This app is exactly what I’ve been waiting for! It lets me import whatever open source LLM I want and runs completely local and private. Thank you for creating this! 🙏

Developer Response

Thank you for your feedback, it means a lot to me.

App Privacy

The developer, Artem Savkin, has indicated that the app’s privacy practices may include handling of data as described below. For more information, see the developer’s privacy policy.

Data Not Collected

The developer does not collect any data from this app.

Privacy practices may vary, for example, based on the features you use or your age. Learn more

You Might Also Like

Enchanted LLM
Developer Tools
Server: Host Files Locally
Developer Tools
AWS IoT Sensors
Developer Tools
Easy CSV Editor Mobile
Developer Tools
Blueterminal
Developer Tools
LogSnag
Developer Tools