TED Talks
How to keep human bias out of AI | Kriti Sharma
AI algorithms make important decisions about you all the time — like how much you should pay for car insurance or whether or not you get that job interview. But what happens when these machines are built with human bias coded into their systems? Technologist Kriti Sharma explores how the lack of diversity in tech…
Science & Technology
Break the Bad News Bubble (Part 2) | Angus Hervey | TED
In a quick talk, he shares three major updates of recent human progress on eradicating ancient diseases, establishing massive new ocean sanctuaries and transforming children’s rights. (This conversation was recorded on December 2, 2024.) If you love watching TED Talks like this one, become a TED Member to support our mission of spreading ideas: Follow…
People & Blogs
How to Protect Your Emotional Health During the Holidays | Guy Winch | TED
The end of the year is often a time to reflect and spend time with family — activities that may seem joyful or anxiety-inducing, depending on your circumstances. Psychologist Guy Winch offers actionable advice on how to manage your emotions with confidence during the holidays, from setting boundaries to healing heartache — above all reminding…
Science & Technology
The Greatest Show on Earth — for Kids Who Need It Most | Sahba Aminikia | TED
TED Fellow and composer Sahba Aminikia brings the healing power of dance, storytelling, music and performance to some of the most dangerous places on Earth. By celebrating children and their communities with beauty and joy, he shows how to cultivate hope, connection and love — even in conflict zones. “The ultimate power is in unity,”…
- Science & Technology · 5 years ago
  Nitya Subramanian: Products and Protocol
- CNET · 5 years ago
  Ways you can help Black Lives Matter movement (links, orgs, and more) 👈🏽
- People & Blogs · 3 years ago
  Sleep Expert Answers Questions From Twitter 💤 | Tech Support | WIRED
- Wired · 6 years ago
  How This Guy Became a World Champion Boomerang Thrower | WIRED
- Wired · 6 years ago
  Neuroscientist Explains ASMR’s Effects on the Brain & The Body | WIRED
- Wired · 6 years ago
  Why It’s Almost Impossible to Solve a Rubik’s Cube in Under 3 Seconds | WIRED
- Wired · 6 years ago
  Former FBI Agent Explains How to Read Body Language | Tradecraft | WIRED
- CNET · 5 years ago
  Surface Pro 7 review: Hello, old friend 🧙
Chuck
April 14, 2019 at 11:27 am
She has no clue about AI.
prettyhappy
April 14, 2019 at 11:33 am
What is AL?? In German, maybe??
mitkoogrozev
April 14, 2019 at 11:56 am
Since AI still can’t learn from scientific experiments or understand scientific writing, then until whole societies are re-formed and engineered based on the latest scientific understanding, exposing it to what we have today will inevitably make the AI ‘racist’, ‘sexist’, ‘elitist’ and give it other such ‘biases’, because that’s what it can sample from today’s societies. This is what is currently ‘objective’; that’s how most of them are currently structured.
Benny
April 14, 2019 at 12:47 pm
It seems AI has never been human. Humans can create anomalies; can AI? It looks to me like AI will only flow with the MSM.
Violet Fyxe
April 14, 2019 at 12:50 pm
You don’t need to worry about AI developing a bias if you program in bias to begin with. ??
Violet Fyxe
April 14, 2019 at 12:53 pm
*AI:* “Women are statistically more likely to buy pregnancy tests than men”
*Kriti Sharma:* “Wow, how sexist!”
Boss Lax316
April 14, 2019 at 12:55 pm
I swear to god if she’s a T-Series subscriber…
cutepinkbandanaman
April 14, 2019 at 2:58 pm
Get the author of “Weapons of Math Destruction” to do a TED talk; the examples used here were really meh.
René Gauthier
April 14, 2019 at 8:03 pm
this was great, thanks Kriti.
bow tie
April 14, 2019 at 8:30 pm
By not making AI that manages online interaction.
Marké Mark
April 14, 2019 at 10:01 pm
Where business is involved, there will always be bias.
Farmhouse Productions
April 15, 2019 at 12:27 am
All technology is nothing but a manifestation of natural laws. The rulers know a lot about how the human mind works, and they are working towards replacing the human mind with AI.
She doesn’t even know what she is talking about while quoting the example of a woman in Africa. Useful idiot.
The main aim of AI will be to control the human mind and manipulate perceptions; soon the human mind will lose the ability to differentiate between right and wrong and be subjugated into perpetual subservience.
And that is when the elites will be successful.
justletmepostthis
April 17, 2019 at 7:49 am
It doesn’t have to be that way.
陈源宇
April 15, 2019 at 10:38 am
In middle school I was bullied by delinquent classmates! They insulted, slandered and beat me, stole my money, tore my books, let the air out of my bike tires, smeared shampoo on my schoolbag, and impersonated me online to insult other students. The delinquent homeroom teacher told me not to retaliate against them; the teacher also slandered me together with the students and slapped me across the face many times! I petitioned the education bureau and the public security bureau, and they said the teacher had never beaten or slandered me! In high school, teachers and students falsely accused me of bringing a knife to school and would not let me board there! The education bureau said someone had testified that I brought a knife! Twice the police locked me in Meizhou Third People’s Hospital, where the medical staff insulted me, beat me and gave me electric shocks, and tied me to a bed without letting me relieve myself! The CCP is nothing but a gang of 1w9, 1w2, 2w1, 2w3, 3w2, 3w4, 6w5, 6w7, 7w6, 7w8, 8w7, 8w9, 9w8, 9w1 con artists!
Tommy Kiddler
April 15, 2019 at 11:00 am
There are tons of problems AI should be thinking about.
Husam Starxin
April 15, 2019 at 11:57 am
I’m sorry, but this is by far one of the dumbest TED talks I’ve ever seen. And take it from a computer science graduate: she has no idea what she’s talking about when it comes to ML and AI.
Cephalic Miasma
April 15, 2019 at 3:58 pm
The entire argument is based on a flawed assumption: that there are no distinctions between racial and gender groups (whether inherent or the result of socioeconomic factors) and that all statements regarding any differences are inherently prejudiced. This needs to be shown first; you cannot merely assert it.
Isedorgamlit
April 15, 2019 at 5:05 pm
Wow, this sets a new low bar; now I could give TED talks too, it seems.
John Farris
April 16, 2019 at 1:59 am
I think it’s funny that they think they know what I want, when, due to boredom, what I want changes every day.
Kevin Reardon
April 16, 2019 at 4:01 am
What we need is fewer dickheads in AI.
Joxman2k
April 16, 2019 at 5:15 am
I didn’t hear anything about “how” to keep human bias out of AI, just that there is bias. I think this has more to do with machine learning than actual AI. Many viewpoints can be part of an AI algorithm, but being neutral should be the goal. I’m not sure how her framing the apparently male-centric developer bias as bad and her more woman-centric bias as correct is supposed to balance things out; exchanging one bias for another is not keeping out human bias. She does bring up an important topic, but it is more about awareness than about solving it.
krishna punyakoti
April 17, 2019 at 4:44 am
Mark Zuckerberg and Elon Musk didn’t look like today’s Mark Zuckerberg and Elon Musk when they had to try hard cracking their first deals, building trust, and struggling to make things happen.
There seems to be more bias in this talk than in the AI or in the stats. Let’s build an AI to screen military job applications by introducing a 50-50 gender split and see how it goes. If we introduce 50-50, it’s not taking out bias; it’s adding more bias, because it’s totally deviant from the ground truth.
Ayala Crescent The Shield Abode
April 17, 2019 at 7:46 am
If a machine says so… I can do the dishes.
justletmepostthis
April 17, 2019 at 7:47 am
“Don’t worry, she will never be on the internet”… She IS the internet… Funny how these people (elitist mindset) will lie just to get what they want, without any regard for anyone but themselves.
Dr Peter jones
April 17, 2019 at 2:50 pm
This is not about AI at all; this is about discrimination.
This is more frightening than you think. A woman goes into a railway station and asks for a ticket from A to B. She is given the same-cost ticket as a man. A blind person goes into the same railway station to ask for the same ticket and is charged twice the amount, even though the law is very clear that a blind person should be charged the lowest fare possible of all tickets, regardless of gender.
Why is this?
Answer: the algorithm, written by a person, discriminates against the blind person because it was written by a non-blind person. It’s a case of a process designed by the sighted for the sighted, which excludes the non-sighted, even though the law has been broken.
The only way forward is yearly testing of machines to certify they have been tested with disabled users before they can be used by the public. Simply put, such machines cement existing prejudices against groups and are not fit for purpose.
Mr Doodleydoo
April 17, 2019 at 9:30 pm
If the groups involved changed the statistics about themselves, the algorithms would reflect that. If White/Asian people suddenly became the least likely to repay loans, the algorithms would reflect that.
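The mechanism this last comment describes can be sketched in a few lines of Python. This is a minimal toy model with made-up, hypothetical data (the groups and numbers are illustrative only, not real lending statistics): a "scoring algorithm" that does nothing but estimate per-group repayment rates from whatever records it is shown, so when the underlying behaviour changes, the algorithm's output changes with it.

```python
from collections import defaultdict

def repayment_rates(records):
    """records: list of (group, repaid) pairs, repaid in {0, 1}.
    Returns each group's observed repayment rate."""
    totals = defaultdict(int)
    repaid = defaultdict(int)
    for group, ok in records:
        totals[group] += 1
        repaid[group] += ok
    return {g: repaid[g] / totals[g] for g in totals}

# Hypothetical data before and after a shift in behaviour.
before = [("A", 1), ("A", 1), ("B", 1), ("B", 0)]
after  = [("A", 1), ("A", 0), ("B", 1), ("B", 1)]

print(repayment_rates(before))  # group A scores higher
print(repayment_rates(after))   # group B scores higher
```

The model has no opinion of its own; retrained on different statistics, it ranks the groups differently, which is exactly the commenter's point.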