A few weeks ago, while on vacation, I had a conversation with my friend Larry, who has been developing his own games and AI for a few years. I mentioned how I was trying to find the most affordable service - OpenAI, Claude, ChatGPT, etc. (The height of my knowledge was to say that it's cheaper to use the API than a monthly subscription.) I displayed my ignorance by opening my mouth!
He was dumbfounded and asked why I was using a service at all - why not download the models to my laptop and run them there? A year ago I listened to several podcasts and even took a course from Dr. Andrew Ng about AI… but it was so theoretical. Back then, Llama had just come out and I tried to install it locally… but due to hardware limitations… nothing worked… so I gave up.
Since then, I bought an M3 MacBook - and Larry’s challenge sparked an interest.
Then, in the airport, he showed me a repo to download. Our first attempt failed since I did not have a working Python environment… but after Googling some things I found some bundled apps…
LM Studio
One quick download of this app, plus another of the Llama model, and I had a working model on my laptop. I jumped on the plane and played for two hours without internet. This thing knew how to create a business in my county and knew the names of towns, economic conditions - and wow!
I then generated haiku poems for my grandson, who had been born only hours before.
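For anyone who wants to go beyond the chat window: LM Studio can also expose the loaded model through a local, OpenAI-compatible server (by default at http://localhost:1234/v1). Here is a minimal Python sketch, assuming that local server is running with a Llama model loaded and the openai package installed; the model name below is just a placeholder.

```python
# A minimal sketch: talking to LM Studio's local server from Python.
# Assumes LM Studio's "Local Server" is running on the default port (1234)
# with a model loaded, and that the `openai` package is installed.
from openai import OpenAI

# LM Studio exposes an OpenAI-compatible endpoint, so the standard client works;
# the API key is unused locally, but the client requires some value.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio serves whichever model is loaded
    messages=[
        {"role": "user", "content": "Write a haiku welcoming a newborn grandson."}
    ],
    temperature=0.7,
)

print(response.choices[0].message.content)
```

The nice part is that any code written against the hosted OpenAI API can be pointed at the laptop instead, simply by changing the base URL.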
At this moment, I became aware that models were small enough to run on my own laptop - and that Larry was indeed smarter than me!
DiffusionBee
A few days later I grabbed DiffusionBee. Again, it has an interface that let me see a “LangChain”-style flow for generating images.
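DiffusionBee itself is point-and-click, but if you are curious what that flow looks like in code, the Hugging Face diffusers library runs the same kind of Stable Diffusion pipeline on Apple Silicon through PyTorch’s “mps” backend. This is only a sketch of the idea, not what DiffusionBee runs internally; it assumes torch, diffusers and transformers are installed and that you can download a Stable Diffusion checkpoint.

```python
# A rough sketch of a text-to-image flow in code, using Hugging Face's
# diffusers library on Apple Silicon (PyTorch's "mps" backend).
# Assumes: pip install torch diffusers transformers accelerate
from diffusers import StableDiffusionPipeline

model_id = "stabilityai/stable-diffusion-2-1"  # any Stable Diffusion checkpoint you can download

pipe = StableDiffusionPipeline.from_pretrained(model_id)
pipe = pipe.to("mps")            # run on the Mac's GPU
pipe.enable_attention_slicing()  # eases memory pressure on Apple Silicon

image = pipe(
    "a watercolor painting of a small county town main street at sunrise",
    num_inference_steps=30,
).images[0]

image.save("town.png")
```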
ComfyUI
This morning I followed the blog at https://medium.com/@tchpnk/comfyui-on-apple-silicon-from-scratch-2024-58def01a3319 and was able to get it working locally.
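Once ComfyUI is running, it serves both the web UI and an HTTP API on http://127.0.0.1:8188, so workflows can also be queued from a script. Here is a minimal sketch, assuming the default address and a workflow exported from the UI in API format as workflow_api.json (the “Save (API Format)” option, available once dev mode is enabled in the settings).

```python
# A minimal sketch: queueing a job against a locally running ComfyUI instance.
# Assumes ComfyUI is up on its default address (http://127.0.0.1:8188) and that
# a workflow has been exported from the UI as "workflow_api.json".
import json
import urllib.request

COMFY_URL = "http://127.0.0.1:8188"

# Load the workflow graph exported from the UI.
with open("workflow_api.json", "r") as f:
    workflow = json.load(f)

# POST the graph to ComfyUI's /prompt endpoint; the response includes a
# prompt_id that can later be looked up in /history to fetch the results.
payload = json.dumps({"prompt": workflow}).encode("utf-8")
request = urllib.request.Request(
    f"{COMFY_URL}/prompt",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as response:
    print(json.loads(response.read()))
```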