london 6.29am monday 2015
quote: Last year, a White House report on “Big Data” cautioned that the “algorithmic decisions raise the specter of ‘redlining’ in the digital economy – the potential to discriminate against the most vulnerable classes of our society under the guise of neutral algorithms.”unquote.
this is in the context of an article called the tiger mom tax. an interesting read, which says that even in poor areas, the asians there are quoted higher prices for the princeton review’s online SAT tutoring.
i first heard of it via this link propublica.org which gives more examples of discriminatory pricing. worth clicking on it.
we are all aware that prices are different if u shop in poor neighbourhoods than in rich ones. that is why i go across the river to vauxhall to shop in the tesco there, rather than in pimlico. there is an added factor, where the smaller tesco express convenience shops are more expensive than the big tesco supermarkets. or why i go to brixton and shop there for my garlic and ginger and chilli. we all know that if we shop in kensington, or knightsbridge, prices will be higher. but we dont expect that to happen in an online store… but this article suggests dont be too sure of it. haha.
even now we are used to ryanair pricing, where prices are adjusted according to how often you visit the site, or how many people are buying at that time. so it looks like the trend is there, to use algorithms to change prices according to many factors, including sex, race, etc. even if they say they dont discriminate according to race, sex etc, behaviour can still influence the algorithm: more men may do something that women dont, and the algorithm will use that to determine the price to charge. or in this case of the tiger mom, asians are more likely to use the tutoring services and willing to pay more too.
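to see how that can happen, here is a toy sketch of my own (made-up numbers and made-up function, not how ryanair or the princeton review actually price anything): the algorithm never sees race or sex at all, only behavioural signals like repeat visits and current demand, yet whichever group visits more and buys more ends up quoted more.

```python
# toy dynamic-pricing sketch -- purely illustrative, not any real company's algorithm.
# the price never looks at race or sex directly; it only uses behavioural signals,
# yet any group that visits more often or buys more ends up quoted a higher price.

def quote_price(base_price, visits, recent_buyers, premium_history):
    """return a personalised price from behavioural signals only."""
    price = base_price
    price *= 1 + 0.05 * min(visits, 5)          # repeat visits read as strong interest
    price *= 1 + 0.02 * min(recent_buyers, 10)  # demand surge: others buying right now
    if premium_history:                         # e.g. past purchases of premium services
        price *= 1.15
    return round(price, 2)

# same product, two shoppers: a first-time casual browser vs a keen repeat visitor
casual = quote_price(100.0, visits=1, recent_buyers=0, premium_history=False)
keen = quote_price(100.0, visits=5, recent_buyers=8, premium_history=True)
print(casual, keen)  # the keen shopper is quoted noticeably more
```

so nobody typed "charge asians more" into the code; the behavioural signals do the sorting on their own.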
how to combat that? no one is saying, because maybe there is no answer. this thing just cannot be won, and we just have to suck it up. it looks like you just cannot beat that algorithm. just typing in your name can pick you out for discrimination. it seems if u type in an asian name, you get the asian prices. its gone beyond just your ZIP code showing wealthy areas that determine the prices. some wealthy areas got low pricing. it is the individuals there, acting by buying, that skew the algorithm.
at least in brixton, i would like to think that i am not charged extra for looking chinese, but maybe that is because i have been shopping there for ages and i know the prices. maybe if i were a chinese tourist things might be different. who knows right? perhaps after all, there is this discrimination happening all the time in this world. it is subtle and may not be noticed…
i got all this from a wordpress reader feed. it gives one way of fighting back: via the law, fighting it in court.
quote: Because disparate impact theory is results-oriented, it would seem to be a good way to challenge algorithmic bias in court. A plaintiff would only need to demonstrate bias in the results, without having to prove that a program was conceived with bias as its goal. But there is little legal precedent. Barocas and Selbst argue in their article that expanding disparate impact theory to challenge discriminatory data-mining in court “will be difficult technically, difficult legally, and difficult politically.” unquote.
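the "results-oriented" bit in that quote can be made concrete with a tiny sketch (the numbers below are made up by me, not from any real case): u just compare outcome rates between two groups, without ever asking how the algorithm works inside. one common yardstick in US employment law is the "four-fifths rule".

```python
# toy disparate-impact check -- results-oriented, as the quote says:
# we only look at outcomes per group, never at the algorithm's internals.
# all numbers below are invented for illustration.

def adverse_impact_ratio(favourable_a, total_a, favourable_b, total_b):
    """ratio of group A's favourable-outcome rate to group B's.
    under the 'four-fifths rule' used in US employment law,
    a ratio below 0.8 is often treated as evidence of disparate impact."""
    rate_a = favourable_a / total_a
    rate_b = favourable_b / total_b
    return rate_a / rate_b

# say 'favourable' means being quoted the low price for online SAT tutoring:
# group A: 30 of 100 customers got the low price; group B: 70 of 100 did.
ratio = adverse_impact_ratio(30, 100, 70, 100)
print(round(ratio, 2))   # about 0.43
print(ratio < 0.8)       # True -> looks like disparate impact
```

of course, as barocas and selbst say, getting a court to accept that kind of statistical showing against a pricing algorithm is another matter entirely.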
Tags: algorithms, asians, poor, pricing, tiger moms