CNCF: Kubernetes is 'foundational' infrastructure for AI: https://thenewstack.io/cncf-kubernetes-is-foundational-infrastructure-for-ai/ via
@TheNewStack & @sjvn
The future of #AI depends on cloud-native computing and #Kubernetes.
@sjvn @TheNewStack remind me in a few years, Steve, but this all brings me back to my own prediction: AI can actually kill the cloud as we know it by mastering today’s cloud tech. And the ultimate future of AI (as well as of electric efficiency in general) will be local.
To elaborate:
- I agree that all the peak electric power and Nvidia’s computing power is going to training, not consumption
- consumption at scale is just regular (modern) cloud computing for horizontal scalability, and we already have that as a commodity in our data centers; the only unusual ingredient, which can be both expensive and complex in a DC and is certainly expensive today for a local single/few-user workload, is access to memory (guess why I think it won’t get cheaper any time soon!)
- so once there has been enough training to make these agents smart enough to lower the volume of training still needed, all that remains for consumption is commonly available cheap hardware, very fast RAM, and a local network
- we all know today that modern PaaS in the cloud is not serverless functions bound to a single vendor’s marketed component proposition; modern PaaS is whatever you could install on your home Linux box in a container, and the cloud just offers easier network and security engineering, which is OK
- but since the same tech can run on my Synology, what my NAS lacks is not the capacity to hold the architecture, just a bit more power and memory
- but I can have affordable servers and workstations that can do it today (still imperfectly, but with potential), so I imagine tomorrow can only be better
- so of course, if you don’t want to keep your own electronics at the office or at home, there will still be affordable cloud AI available for common tasks…
But with micro nuclear plants instead of massive installations keeping AI’s data centers running, you will have mini mainframes and workstations with local AIs for enterprises, with licensed/purchased models deployed on-premises
And that will surely shut down this whole ecological debate about AI… and AI will stay, even if OpenAI fails to sustain itself… it doesn’t matter.
We need to ship AI to spaceships so Mother will help us discover aliens… that thing needs to be considerably more mobile than it is today. If containers help move AI around and let it expand, great… we’ve taken one more step toward maturing it…
@hollowone @TheNewStack "Micro nuclear plants." That's the immediate problem. The fastest we'll get these out commercially is five years.
@sjvn @TheNewStack so is local AI today. It can’t match the speed and accuracy of cloud AI… but all of that will evolve