
Swift app for the Mac, specialized in MLX LLMs



Experiments with large language models (LLMs) on home computers have been in vogue since the hype around DeepSeek. Apple Silicon Macs are well suited for this thanks to their RAM configurations and integrated processors. With MLX, Apple has also released a dedicated framework for accelerating machine learning (ML) on Arm Macs. The "array framework for Apple silicon" is designed to make machine learning particularly efficient on current Macs. A new free app in the Mac App Store now specializes in models that use this technology.



Pico AI Homelab comes from Starling Protocol Inc. and requires macOS 15 (Sequoia) or later. Like LM Studio, which runs as an Electron app, it makes it very easy to try out different models; currently there are said to be more than 300. Besides the distilled variants of DeepSeek R1, these include Mistral, Meta's Llama, Alibaba's Qwen, Google's Gemma, and Microsoft's Phi in various sizes and versions. All of them are adapted to MLX, which should make them more performant than GGUF models.

Pico AI Homelab is also compatible with Ollama and uses its API, so alternative chat apps such as Open WebUI, MindMac, or Ollamac can be used as well. By default, Pico AI Homelab runs as a local HTTP server, so chats take place in the browser. The entire system runs offline and no data is sent to the internet; Pico AI Homelab does not collect any user information.
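Because the app exposes an Ollama-compatible HTTP API, talking to it from a script can be sketched roughly as follows. This is a minimal sketch based on Ollama's public chat API: the model name `deepseek-r1` and the default Ollama port 11434 are assumptions, not confirmed by the article, so check the app's settings for the actual values.

```python
import json
import urllib.request

# Ollama-compatible chat endpoint. Port 11434 is Ollama's default
# and is assumed here -- Pico AI Homelab may use a different one.
API_URL = "http://localhost:11434/api/chat"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for an Ollama-style /api/chat request."""
    return {
        "model": model,  # e.g. a distilled DeepSeek R1 variant (assumed name)
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete response instead of a stream
    }

def send_chat(model: str, prompt: str) -> str:
    """POST the request to the local server and return the reply text."""
    body = json.dumps(build_chat_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        API_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
    # Ollama's non-streaming chat response carries the text here:
    return reply["message"]["content"]

if __name__ == "__main__":
    print(send_chat("deepseek-r1", "Why is the sky blue?"))
```

Since the server is Ollama-compatible, the same request shape should also work with any of the alternative chat front ends mentioned above.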


Pico AI Homelab runs on all Apple Silicon Macs from the M1 onward. The minimum RAM is said to be 16 GB, but 32 GB or more is extremely useful for large language models. Command-line skills are not required to use Pico AI Homelab. "Thanks to the guided one-click installation, even beginners can get started quickly, while experienced users benefit from flexible configuration options," the manufacturer writes.

The app is currently completely free; whether that will change later is unclear. There are no API costs or subscription fees to AI providers. You should not expect too much from local LLMs, however: with far less computing power than server-side models, the output is worse and hallucinations are more frequent. Local models are mainly fun for experimenting.


(BSC)


