I'm starting to call the GPU just "General Processing Unit" because honestly at this point it's just a CPU with a less intuitive (yet faster) multicore approach.
@meluzzy imo the CPU can be less intuitive in many ways due to the multi-level caching architecture. the individual cores of a GPU can be easier to understand even if efficiently programming for it can be a trip
@hipsterelectron people that convert huge codecs into shaders (not even using OpenCL or CUDA, are absolute gods and they scare me)
@meluzzy cpu having lower-latency interconnect to i/o hardware makes it more useful for a lot of stuff i wanna do. idk if that's an intrinsic problem though
@hipsterelectron I mean the GPU was always meant to work as a server component, not as a directly accessible component.
@meluzzy yeah but WHAT IF
@hipsterelectron We can only dream of the power that could unleash.
Wouldn't be too crazy to believe GPUs will end up replacing workstation CPUs at some point. After all, most of the cursed AI datacenter racks have a CPU just because they need one to operate.
@meluzzy i am fascinated by asynchronous processors as well
@hipsterelectron A 12-core clockless ARM CPU, now that's the holy goat
@hipsterelectron CPU when chill: 500MHz 
CPU when ANGY: 5GHz 
@meluzzy it's impossible to overclock this cpu! [crowd boos, jeering]
because you can set it to arbitrary frequencies!!!! [crowd screams, throwing bras on stage, starts moshing]
@meluzzy remember when they called it GPGPU