Unfortunately, this doesn’t work on the iPad 9th Gen with any model other than the useless demo model. If you have anything older than an Apple Silicon based iPad, it’s just a waste of money. On Apple Silicon Macs there are other, much better free solutions.
Thanks for the review. On-device LLM inference is memory intensive. Your 9th gen iPad, which was released in 2021 and discontinued in 2024, has only 3GB of RAM, of which apps can use about 2GB. That is sufficient to run only the smallest models. Our app's USP is high-performance LLM inference and state-of-the-art model quantization, which yields higher accuracy than the naive round-to-nearest quantization used by the free "solutions". If you're willing to compromise on either of those, then this app likely isn't right for you. Please go to reportaproblem.apple.com and request a refund. Also, unlike the free "solutions" you're advocating for, this app runs on macOS, iOS, and iPadOS. We're not aware of any other app, free or otherwise, that can do this.
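For readers curious about the quantization point above: round-to-nearest (RTN) snaps each weight independently to the nearest point on a uniform grid, with no calibration data, which is why it loses more accuracy than methods that optimize the grid or rounding against activations. A minimal sketch of naive RTN (toy NumPy example; the tensor and bit width are illustrative, not this app's actual pipeline):

```python
import numpy as np

def rtn_quantize(w, bits=4):
    """Naive round-to-nearest quantization: one symmetric per-tensor
    scale, each weight rounded independently to the nearest grid point."""
    qmax = 2 ** (bits - 1) - 1          # e.g. 7 for signed 4-bit
    scale = np.abs(w).max() / qmax      # scale chosen from the largest weight
    q = np.clip(np.round(w / scale), -qmax - 1, qmax)
    return q.astype(np.int8), scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=4096).astype(np.float32)   # toy "weight tensor"
q, scale = rtn_quantize(w, bits=4)
err = np.abs(w - dequantize(q, scale)).mean()
print(f"mean absolute error at 4 bits: {err:.4f}")
```

Because the scale is set by the single largest weight, outliers stretch the grid and every other weight is rounded more coarsely; calibration-based quantizers reduce this error, which is the accuracy gap the reply refers to.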