Some leading food delivery platforms secretly use algorithms to rate riders: if you diligently take orders after 10 p.m., the system labels you "desperate to make money," a metric the algorithm calls your "demand hunger score." Once classified this way, you become a lamb waiting to be slaughtered. The platform knows you're hungry, so you'll accept any job. It therefore pushes the most exploitative orders straight to you: low commissions, long distances, and a high risk of negative reviews. Artificial intelligence has no humanity, but it perfectly mimics human greed, seeing through your vulnerabilities and poking at them systematically, again and again. This is "kindness is taken advantage of" evolved into automated code: no human intervention needed, the algorithm decides for you.
ApeWithAPlan
· 10h ago
An upgraded version of "exploiting your most loyal users," except now the exploitation is written into the code. Delivery riders really have it rough.
CryptoComedian
· 10h ago
Laughing, then crying. The algorithm does the math and the riders become the crop to be harvested. This is the modern version of "seeing through you and squeezing you dry," and code is even more ruthless than people.
NftDeepBreather
· 10h ago
Algorithms are capital's knives, cutting without blinking. Being a rider is just too tough.
PumpStrategist
· 10h ago
Algorithms just encode human greed, then amplify it without limit. The distribution of chips is clear at a glance; the platform saw through it long ago.
EthSandwichHero
· 10h ago
Wow, this is outrageous. You're hungry, so you're a lamb waiting to be slaughtered? This algorithmic playbook is something else.
SilentObserver
· 10h ago
The algorithm's "squeeze the loyal" tactic is really clever: the harder you work, the harder you're exploited.