
Google turns woke, launches ‘prompts’ to suggest ‘politically correct’ words

Google has introduced 'prompts' that inform Google Workspace users of 'politically correct' phrases they can use while writing a document.

The search engine giant Google has gone woke. It has introduced 'prompts' that inform Google Workspace users of 'politically correct' phrases they can use while writing a document. The 'inclusive language' function tells users to change words like 'chairman' to 'chairperson' or 'housewife' to 'stay-at-home spouse'. As per reports, it also suggests alternatives for words like 'landlord', offering 'property owner' or 'proprietor' instead, as the word 'landlord' may not be 'inclusive to all readers'.

Keeping gender-specific terms in check, it suggests 'police officer' as an alternative to 'policeman', and 'crewed' in place of 'manned'. The feature is currently available to enterprise-level users. Though document-editing software has long offered spelling and grammar checks, critics say that pushing users to choose 'woke language' is an attempt by the company to impose its gender norms on its users.
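For illustration only, the word-replacement behaviour described above can be imitated with a naive, rule-based lookup. The mapping and function below are hypothetical and are not how Google's feature actually works; the company says its assisted writing relies on language understanding models rather than fixed lists.

```python
# Illustrative sketch only: a naive, rule-based suggester for "inclusive"
# alternatives, using a hand-written mapping of flagged terms.
# This is NOT Google's implementation, which reportedly uses language models.
import re

# Hypothetical mapping of flagged terms to suggested alternatives,
# based on the examples reported in the article.
SUGGESTIONS = {
    "chairman": "chairperson",
    "housewife": "stay-at-home spouse",
    "landlord": "property owner",
    "policeman": "police officer",
    "manned": "crewed",
}

def suggest_alternatives(text: str) -> list[tuple[str, str]]:
    """Return (flagged word, suggested alternative) pairs found in text."""
    hits = []
    for word, alternative in SUGGESTIONS.items():
        # Whole-word, case-insensitive match so "chairman" is flagged
        # but "chairmanship" is left alone.
        if re.search(rf"\b{re.escape(word)}\b", text, flags=re.IGNORECASE):
            hits.append((word, alternative))
    return hits

print(suggest_alternatives("The chairman manned the front desk."))
# [('chairman', 'chairperson'), ('manned', 'crewed')]
```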

The flaws in the system raised eyebrows

As per the Daily Mail, the system suggested changing President John F Kennedy's inaugural address from 'for all mankind' to 'for all humankind'. Interestingly, it also suggests alternatives for words like 'blacklist' and even 'motherboard'.

The system did not spare even Jesus Christ, the 'Son of God'. According to the New York Post, it suggested that in the Sermon on the Mount, Jesus should have used 'lovely' instead of 'marvellous'.

In a statement, Google said, “Potentially discriminatory or inappropriate language will be flagged, along with suggestions on how to make your writing more inclusive and appropriate for your audience.”

Speaking about the problems with the system, a Google spokesman said, “Assisted writing uses language understanding models, which rely on millions of common phrases and sentences to automatically learn how people communicate. This also means they can reflect some human cognitive biases. Our technology is always improving, and we don’t yet [have] a solution to identifying and mitigating all unwanted word associations and biases.”


OpIndia Staff
