
Of Brides and Google’s attempts to culturally sensitise AI

You reap what you sow.

Credit: Google AI Blog

Can AI be culturally biased? Well, it already is.

Due to reasons both historic and economic, many of the datasets available for object recognition feature images mainly from the West, which leads to machine learning models that reflect the bias inherent in their training data.

For example, current image recognition systems are far more likely to recognize the scene and objects of a Western wedding than those of an Eastern one, which contains entities the model hasn’t been sufficiently trained to ‘see’, such as a bride in a saree or, perhaps, a marriage homa.
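To make the skew concrete, here is a minimal sketch of how such a gap can be surfaced: group an evaluation set by region and compare per-region accuracy. The field names and toy records below are hypothetical illustrations, not part of Google’s benchmark or competition.

```python
from collections import defaultdict

def accuracy_by_region(records):
    """Compute recognition accuracy per region.

    `records` is an iterable of dicts with hypothetical keys
    'region', 'true_label', and 'predicted_label'.
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        total[r["region"]] += 1
        if r["predicted_label"] == r["true_label"]:
            correct[r["region"]] += 1
    return {region: correct[region] / total[region] for region in total}

# Toy illustration: a model trained mostly on Western wedding images
# tends to score far better on that slice of the evaluation set.
records = [
    {"region": "western", "true_label": "bride", "predicted_label": "bride"},
    {"region": "western", "true_label": "bride", "predicted_label": "bride"},
    {"region": "south_asian", "true_label": "bride", "predicted_label": "costume"},
    {"region": "south_asian", "true_label": "bride", "predicted_label": "bride"},
]
print(accuracy_by_region(records))  # e.g. {'western': 1.0, 'south_asian': 0.5}
```

A large gap between the slices is the quantitative face of the qualitative failure described above.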

To counter this, Google launched the Inclusive Images Competition, which aimed to make the machine learning models themselves more inclusive. And while the effort was only partly successful (just one of the five finalists recognized an Indian bride), Google AI reportedly plans to release a diverse 500,000-image dataset on December 7.