It's easy: we have reached AGI when there are zero jobs left. Or at least zero non-manual-labor jobs. If there is a single non-physical job left, then that means that person must be doing something that AI can't, so by definition, it's not AGI.
I think it'll be a steep sigmoid function. For a long time it'll be a productivity booster, but without enough "common sense" to replace people. We'll all laugh about how silly it was to worry about AI taking our jobs. Then some AI model will finally get over that last hump, maybe 10 or 20 years from now (or 1000, or 2), and it will be only a couple of months before everything collapses.
I dislike your definition. There are many problems besides economic ones. If you defined "general" to mean "things the economy cares about", then what do you call the sorts of intelligences that are capable of things that the economically relevant ones are not?
A specific key opens a subset of locks, a general key would open all locks. General intelligence, then, can solve all solvable problems. It's rather arrogant to suppose that humans have it ourselves or that we can create something that does.
It also partitions jobs into physical and intellectual aspects alone. Lots of jobs have huge emotional/relational/empathetic components too. A teacher could get by being purely intellectual, but the really great ones have motivational/inspirational/caring aspects that an AI never could. Even if an AI says the exact same things, it doesn't have the same effect, because everyone knows it's just an algorithm.
And most people get by on those jobs by faking the emotional component, at least some of the time. AGI presumably can fake perfectly and never burn out.
Have a long talk with any working teacher or therapist. If you think the regular workload lets them offer genuine emotional support to all the people they work with, always, every day, regardless of their personal circumstances, you're mistaken. Or the person you're talking with is incredibly lucky.
It doesn't have to be much, or intentional, or even good for that matter. My kids practice piano because they don't want to let their teacher down. (Well, one does. The other is made to practice because WE don't want to let the teacher down).
If the teacher was a robot, I don't think the piano would get practiced.
IDK how AI gains that ability. The requirement is basically "being human". And it seems like there's always going to be a need for humans in that space, no matter how smart AI gets.