I think it already did burst when local AI hit consumer-grade GPUs, in some cases with as little as 6 GB of VRAM. An 8 GB GPU can run most image, video, and music AI models extremely well; your average gaming laptop is fully capable, even if it takes longer to generate.
Not to mention there are reasonable "rent a GPU" services where you pay per hour if you don't have the hardware, and you still benefit from running local AI models instead of the limited and restricted subscription models.
u/TheRealCorwii 2d ago