if you ramp up the production of information, you need better discernment, because your capacity to process it hasn’t just magically increased
without better discernment things will break *even if* the quality of the additional information is high ….
it’s not just “AI slop” that’s a problem: even helping regular scientists produce more is damaging if we don’t simultaneously provide better tools for discernment (human and technical). but if anything, we are weakening those (e.g. by de-skilling)…
that’s a disastrous feedback loop in the making…
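a quick back-of-the-envelope sketch of the capacity argument (my own toy model and made-up numbers, not anything measured): hold vetting capacity fixed, scale up output, and the number of uncaught flaws entering the record grows even when the per-item flaw rate stays low.

```python
# Toy model (illustrative only): fixed vetting capacity vs. growing output.
# All parameter values are arbitrary; only the shape of the result matters.

def uncaught_flaws(produced, capacity, p_flaw=0.05, catch_rate=0.9):
    """Expected flawed items that slip through per unit time.

    produced   -- items generated per unit time
    capacity   -- items that can be carefully vetted per unit time (fixed)
    p_flaw     -- chance any single item is flawed (kept low: "high quality")
    catch_rate -- chance careful vetting catches a flaw
    """
    vetted = min(produced, capacity)
    unvetted = produced - vetted
    # Flaws slip through when vetting misses them or never happens at all.
    return vetted * p_flaw * (1 - catch_rate) + unvetted * p_flaw

if __name__ == "__main__":
    capacity = 100  # discernment capacity stays flat
    for produced in (100, 200, 500, 1000):
        slipped = uncaught_flaws(produced, capacity)
        print(f"produced={produced:4d}  uncaught flaws ~ {slipped:5.1f} "
              f"({slipped / produced:.1%} of output)")
```

even with only 5% of items flawed, pushing output from 100 to 1000 against a fixed capacity of 100 takes the uncaught-flaw share from ~0.5% to ~4.6% of everything produced, which is the point: the damage comes from the mismatch, not from the quality of the extra output.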