On Google, nearly 50% of political mentions returned in response to neutral questions refer to the radical right; on ChatGPT, the figure is around 30%. When we know a newspaper has an editorial line, we activate our critical filters. Yet the mere biased ordering of results can shift the electoral preferences of undecided voters by more than 20%. And the problem is not only the opacity: it is that the output is read as neutrality. If we accept that politics increasingly depends on the algorithm, scrutiny must keep pace with that change.
Algorithm, a major biased political editor
Thursday, 16 April 2026



