For most applications, GC is a really nice idea. I don't really understand the aversion to it when it is done well and not wastefully. I suspect many people were slightly traumatized by bad early experiences with slow, inefficient GC and never recovered.
But one use case where GC, as typically implemented, can render a language wholly inappropriate is real-time systems, where tight timing is a hard requirement. Even multithreaded GC languages still do not provide fine enough control over which threads should be garbage-collected and which should not. This actually matters, but most GC language authors carry the mentality that you should never be trusted to decide memory matters yourself, extending many of the same micromanagement mistakes of the UNIX paradigm. So C endures, especially in the embedded space, because at least it does not overtly try to get in your way and retains most of the efficiency of a proper low-level language, whereas most languages with GC target much higher levels of abstraction where low-level work isn't a serious consideration.