- [AI] Use StableDiffusion.cpp in server mode. "Things I want to do: use StableDiffusion.cpp in server mode. Advantages and disadvantages: first, let's summariz..." (posted 2026.01.12, updated 2026.03.15)
- [AI] Running LLM with image input locally (AMD GPU/CPU compatible). "Things I want to do: we will run an LLM (chat AI) locally with image input using llama.cpp. This article uses Qwen2.5..." (posted 2026.01.05, updated 2026.03.15)
- [AI] Run SD3.5, rumored to be fast and high-quality, via the command line (AMD GPU/CPU compatible). "Things I want to do: run SD3.5-medium (stable-diffusion-3.5-medium) from the command line using stable-diffusion.cpp..." (posted 2026.01.05, updated 2026.03.15)
- [AI] Running LLM locally (AMD GPU/CPU compatible). "Things I want to do: run an LLM (chat AI) locally using llama.cpp. This article uses Gemma, Google's local model. ..." (posted 2026.01.05, updated 2026.03.15)
- [AI] Perform image editing with Qwen-Image from the command line (AMD GPU/CPU compatible). "Things I want to do: use stable-diffusion.cpp to perform Qwen-Image image editing from the command line. This imag..." (posted 2025.12.22, updated 2026.03.15)
- [AI] Run Qwen-Image from the command line (AMD GPU/CPU compatible). "Things I want to do: run Qwen-Image from the command line using stable-diffusion.cpp. It can run on both AMD GPUs ..." (posted 2025.12.20, updated 2026.03.15)
- [AI] Run StableDiffusion from the command line (AMD GPU/CPU compatible). "Things I want to do: run StableDiffusion from the command line using stable-diffusion.cpp. It can run on both AMD ..." (posted 2025.12.20, updated 2026.03.15)
- [AI] Let's try running Z-Image-Turbo on the CPU. "Things I want to do: I'll try using Alibaba's Z-Image-Turbo image generation model in an environment without CUD..." (posted 2025.12.19, updated 2026.03.15)
- [AI] Recent locally usable image generation models. "Things I want to do: I've compiled a list of recently released image generation models that can be used locally. I..." (posted 2025.12.19, updated 2026.03.15)
- [AI] Comparison with the use of Gemini 2.5 Flash Image (nano-banana). "Things I want to do: we will use Gemini 2.5 Flash Image (nano-banana) and compare it with Gemini 2.5 Flash Image. ..." (posted 2025.08.29, updated 2026.03.15)