TED Talks

How to keep human bias out of AI | Kriti Sharma

AI algorithms make important decisions about you all the time — like how much you should pay for car insurance or whether or not you get that job interview. But what happens when these machines are built with human bias coded into their systems? Technologist Kriti Sharma explores how the lack of diversity in tech is creeping into our AI, offering three ways we can start making more ethical algorithms.

The TED Talks channel features the best talks and performances from the TED Conference, where the world’s leading thinkers and doers give the talk of their lives in 18 minutes (or less). Look for talks on Technology, Entertainment and Design — plus science, business, global issues, the arts and more.

26 Comments

  1. Chuck

    April 14, 2019 at 11:27 am

    She has no clue about AI.

  2. prettyhappy

    April 14, 2019 at 11:33 am

    What is AL? In German, maybe?

  3. mitkoogrozev

    April 14, 2019 at 11:56 am

    Since AI still can’t learn from scientific experiments or understand scientific writing, then until whole societies are re-formed and engineered based on the latest scientific understanding, exposing it to what we have today will inevitably make the AI ‘racist’, ‘sexist’, ‘elitist’ and other such ‘biases’, because that’s what it can sample from today’s societies. This is what is currently ‘objective’; that’s how most societies are currently structured.

  4. Benny

    April 14, 2019 at 12:47 pm

    It seems AI has never been human. Humans can create anomalies; can AI? It looks to me like AI will only go with the flow of the MSM.

  5. Violet Fyxe

    April 14, 2019 at 12:50 pm

    You don’t need to worry about AI developing a bias if you program in bias to begin with.

  6. Violet Fyxe

    April 14, 2019 at 12:53 pm

    *AI:* “Women are statistically more likely to buy pregnancy tests than men”
    *Kriti Sharma:* “Wow, how sexist!”

  7. Boss Lax316

    April 14, 2019 at 12:55 pm

    I swear to god if she’s a T-Series subscriber…

  8. cutepinkbandanaman

    April 14, 2019 at 2:58 pm

    Get the author of “Weapons of Math Destruction” to do a TED talk; the examples used here were really meh.

  9. René Gauthier

    April 14, 2019 at 8:03 pm

    This was great, thanks Kriti.

  10. bow tie

    April 14, 2019 at 8:30 pm

    By not making AI that manages online interaction.

  11. Marké Mark

    April 14, 2019 at 10:01 pm

    Where business is involved, there will always be bias.

  12. Farmhouse Productions

    April 15, 2019 at 12:27 am

    All technology is nothing but a manifestation of natural laws. The rulers know a lot about how the human mind works, and they are working towards replacing the human mind with AI.
    She doesn’t even know what she is talking about when quoting the example of a woman in Africa. Useful idiot.
    The main aim of AI will be to control the human mind and manipulate perceptions; soon the human mind will lose the ability to differentiate between right and wrong and be subjugated into perpetual subservience.
    And that is when the elites will be successful.

    • justletmepostthis

      April 17, 2019 at 7:49 am

      It doesn’t have to be that way.

  13. 陈源宇

    April 15, 2019 at 10:38 am

    I was bullied by delinquent classmates in middle school! Classmates insulted, slandered and beat me, stole my money, tore my books, let the air out of my bike tires, smeared shampoo on my schoolbag, and impersonated me online to insult other students. The delinquent homeroom teacher told me not to retaliate against them, and the teacher joined the students in slandering me and slapped me across the face many times! I petitioned the Education Bureau and the Public Security Bureau, and they said the teacher never beat or slandered me! In high school, teachers and students falsely accused me of bringing a knife to school and would not let me board there! The Education Bureau said someone testified that I had brought a knife! The police twice locked me up in the Third People’s Hospital of Meizhou, where the medical staff insulted, beat and electro-shocked me and tied me to a bed so I could not relieve myself! The CCP is nothing but a gang of 1w9, 1w2, 2w1, 2w3, 3w2, 3w4, 6w5, 6w7, 7w6, 7w8, 8w7, 8w9, 9w8, 9w1 swindlers!

  14. Tommy Kiddler

    April 15, 2019 at 11:00 am

    There are tons of problems AI should think about.

  15. Husam Starxin

    April 15, 2019 at 11:57 am

    I’m sorry, but this is by far one of the dumbest TED talks I’ve ever seen. And take it from a computer science graduate: she has no idea what she’s talking about when it comes to ML and AI.

  16. Cephalic Miasma

    April 15, 2019 at 3:58 pm

    The entire argument is based on a flawed assumption: that there are no distinctions between racial and gender groups (whether inherent or the result of socioeconomic factors) and that all statements regarding any differences are inherently prejudiced. This needs to be shown first; you cannot merely assert it.

  17. Isedorgamlit

    April 15, 2019 at 5:05 pm

    Wow, this sets a new low bar; now I could give TED talks too, it seems.

  18. John Farris

    April 16, 2019 at 1:59 am

    I think it’s funny that they think they know what I want, when, due to boredom, what I want changes every day.

  19. Kevin Reardon

    April 16, 2019 at 4:01 am

    What we need is fewer dickheads in AI.

  20. Joxman2k

    April 16, 2019 at 5:15 am

    I didn’t hear anything about “how” to keep human bias out of AI, just that there is bias. I think this has more to do with machine learning than actual AI. Many viewpoints can be part of an AI algorithm, but being neutral should be the goal. I’m not sure how presenting the apparent male-centric bias of developers as bad and her more woman-centric bias as correct is supposed to balance that out; exchanging one bias for another is not keeping out human bias. She does bring up an important topic, but this is more about awareness than about solving it.

  21. krishna punyakoti

    April 17, 2019 at 4:44 am

    Mark Zuckerberg and Elon Musk didn’t look like the Mark Zuckerberg and Elon Musk of today when they had to try hard cracking their first deals, building trust, and struggling to make things happen.
    There seems to be more bias in this talk than in the AI or in the stats. Let’s build an AI to screen military job applications by introducing a 50-50 gender split and see how it goes. If we are introducing 50-50, it’s not taking out bias; it’s adding more bias, because it’s totally deviant from the ground truth.

  22. Ayala Crescent The Shield Abode

    April 17, 2019 at 7:46 am

    If a machine says so… I can do dishes.

  23. justletmepostthis

    April 17, 2019 at 7:47 am

    “Don’t worry, she will never be on the internet”… She IS the internet. Funny how these people (elitist mindset) will lie just to get what they want, without any regard for anyone but themselves.

  24. Dr Peter jones

    April 17, 2019 at 2:50 pm

    This is not about AI at all; this is about discrimination.

    This is more frightening than you think. A female goes into a railway station and asks for a ticket from A to B. She is given the same-cost ticket as a man. A blind person goes into the same railway station to ask for the same ticket and is charged twice the amount, even though the law is very clear that a blind person should be charged the lowest possible fare of all tickets, regardless of gender.

    Why is this?
    Answer: the algorithm, written by a person, discriminates against the blind person because it was written by a non-blind person. It’s a case of a process designed by the sighted for the sighted, which excludes the non-sighted even though the law has been broken.

    The only way forward is yearly testing of machines to certify that they have been tested for accessibility before they can be used by the public. Simply put, such machines cement existing prejudices against groups and are not fit for purpose.

  25. Mr Doodleydoo

    April 17, 2019 at 9:30 pm

    If the groups involved changed the statistics about themselves, the algorithms would reflect that. If White/Asian people suddenly became the least likely to repay loans, the algorithms would reflect that.
