Humanity is far from ideal in the truth and information it has accumulated. AI is likely to learn and implement, extremely swiftly, design choices that depend on that less-than-ideal truth. Consider the implications of carrying flawed medical research forward as if it were established truth.
Addressing the concerns of others working on complex projects takes time. There is high potential for people to devalue the projects others are working on in ways that hurt rather than help, specifically projects that deliver less than ideally on social concerns such as diversity and reduced sexism. A case could be made that, so long as any oppression is permitted in society, all projects deliver on those concerns less than sufficiently. Derailing competitors, rather than validating and working through problems (what the Golden Rule would require), could become common practice and become more enabled through Artificial Intelligence.
- Time requirements and competition, rather than accounting for all value involved, are less-than-ideal ground to stand on for delivering something better for humanity, at least as precedent in human history has shown.
There is potential for AI to highlight legal details that could sink companies, in a way that leaves society with less support and no requirement to replace the support those companies provided. A greater ability to find details to attack does not necessarily equal sufficient capacity to support.
Prioritizing throughput over righting the wrongs of history has precedent in humanity. Were the Pyramids destroyed over their investment in slavery? Was the Roman Colosseum destroyed over its investment in barbarism? One might hypothesize that the direction taken before World War I and World War II drew on less-than-ideal precedent in history.