Opaque algorithms are creating an invisible cage for platform workers

This points to a broader concern about the way platforms use algorithms to control participants. Platform algorithms create an invisible cage for workers, who have no reliable way of understanding how their data is processed or how it determines their success on the platform. As a result, the platform's algorithm claims to "know" workers better than they know themselves, yet in ways that remain inaccessible to them.