I think a big picture view makes the problem clearer.
Licensing material means paying the owner of some intellectual property. If we expand copyright to require licensing for AI training, the owners can demand more money for no additional work.
Where does the wealth that flows to those owners come from? It comes from the people who do the work; there is nowhere else it could possibly come from.
That has some implications.
Research and development progress more slowly because we not only have to work on improving things, but also have to pay off property owners who contribute nothing. If you zoom in from the big-picture view, this is where small developers and open source suffer: they have to pay or create their own new datasets, which is extra work for no extra benefit.
It also means that inequality increases: the extra cash flow directs more income to certain property owners.
I understand. The idea would be to hold AI makers liable for contributory infringement, reminiscent of the Betamax case.
I don't think that would work in court. The argument is much weaker here than in the Betamax case, and even there it didn't convince the court. But yes, it's prudent to get explicit permission, just in case of a case.