
  • The large tech companies, Google, Meta (formerly Facebook), and Microsoft, are in a race to introduce new artificial intelligence systems and what are called chatbots that you can have conversations with, more sophisticated than Siri or Alexa.

  • Microsoft's AI search engine and chatbot, Bing, can be used on a computer or cell phone to help with planning a trip or composing a letter.

  • It was introduced on February 7 to a limited number of people as a test and initially got rave reviews.

  • But then several news organizations began reporting on a disturbing so-called alter ego within Bing chat called Sydney.

  • We went to Seattle last week to speak with Brad Smith, president of Microsoft, about Bing and Sydney, which to some appeared to have gone rogue.

  • The story will continue in a moment.

  • Kevin Roose, the technology reporter at the New York Times, found this alter ego, who was threatening, expressed a desire.

  • It's not just Kevin Roose, it's others. It expressed a desire to steal nuclear codes, threatened to ruin someone.

  • You saw that.

  • Whoa. What was your... you must have said, oh my God. My reaction was, we better fix this right away, and that is what the engineering team did.

  • But she talked like a person, and she said she had feelings. You know, I think there is a point where we need to recognize when we're talking to a machine.

  • It's a screen.

  • It's not a person.

  • I just want to say that it was scary.

  • I'm not easily scared and it was scary.

  • It was chilling.

  • Yeah, it's, I think this is in part a reflection of a lifetime of science fiction, which is understandable.

  • It's been part of our lives.

  • Did you kill her?

  • I don't think she was ever alive.

  • I am confident that she's no longer wandering around the countryside if that's what you're concerned about.

  • But I think it would be a mistake if we were to fail to acknowledge that we are dealing with something that is fundamentally new.

  • This is the edge of the envelope, so to speak. This creature appeared as if there were no guardrails.

  • Now, the creature jumped the guardrails, if you will, after being prompted for two hours with the kind of conversation that we did not anticipate.

  • And by the next evening, that was no longer possible.

  • We were able to fix the problem in 24 hours.

  • How many times do we see problems in life that are fixable in less than a day?

  • One of the ways he says it was fixed was by limiting the number of questions and the length of the conversations. You say you fixed it?
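
A minimal sketch of the kind of fix described, capping the number of questions in a single chat session. The class name, method names, and the limit value are illustrative assumptions, not Microsoft's actual implementation.

```python
# Hypothetical sketch: limit how many turns one chat session may contain.
class ChatSession:
    def __init__(self, max_turns=5):
        self.max_turns = max_turns  # assumed per-session question limit
        self.turns = 0

    def ask(self, prompt):
        if self.turns >= self.max_turns:
            # Once the cap is hit, refuse and force a fresh session.
            return "This conversation has reached its limit. Please start a new topic."
        self.turns += 1
        # Placeholder for the underlying model call.
        return f"Answer to: {prompt}"

session = ChatSession(max_turns=2)
print(session.ask("Help me plan a trip"))
print(session.ask("Now draft a letter"))
print(session.ask("One more question?"))  # refused: limit reached
```

Bounding the conversation length also bounds how far the model can drift from its initial instructions, which is one plausible reason a turn cap was chosen as a quick mitigation.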

  • I've tried it.

  • I tried it before and after. It was loads of fun and it was fascinating, and now it's not fun.

  • Well, I think it will be very fun again. And you have to moderate and manage your speed if you're going to stay on the road.

  • So as you hit new challenges, you slow down, you build the guardrails and the safety features, and then you can speed up again.

  • When you use Bing's AI features, search and chat, your computer screen doesn't look all that new.

  • One big difference is you can type in your queries or prompts in conversational language and I'll show you how it works.

  • Okay.

  • Okay.

  • Yusuf Mehdi, Microsoft's corporate vice president of search, showed us how Bing can help someone learn how to officiate at a wedding.

  • What's happening now is Bing is using the power of AI, and it's going out to the internet.

  • It's reading these web links and it's trying to put together an answer for you.

  • So the AI is reading all those links.

  • Yes.

  • And it comes up with an answer.

  • It says congrats on being chosen to officiate a wedding.

  • Here are the five steps to officiate the wedding.

  • We added the highlights to make it easier to see.
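
The demo above follows a retrieve-then-answer pattern: fetch relevant web results, then condense them into a single answer with sources. This is a toy sketch; the search function, URLs, and snippets are stand-ins, and a real system like the one described would feed the retrieved text to a language model rather than stitching snippets together.

```python
# Hypothetical retrieve-then-answer sketch (not Bing's actual pipeline).
def search_web(query):
    # Stand-in for a real search API; returns (url, snippet) pairs.
    return [
        ("https://example.com/officiate", "1. Get ordained. 2. Meet the couple."),
        ("https://example.com/script", "3. Write and rehearse the ceremony script."),
    ]

def answer(query):
    results = search_web(query)
    # A real system would summarize these snippets with a language model;
    # here we simply join them and cite the sources.
    body = " ".join(snippet for _, snippet in results)
    sources = ", ".join(url for url, _ in results)
    return f"{body} (Sources: {sources})"

print(answer("How do I officiate a wedding?"))
```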

  • He says Bing can handle more complex queries.

  • Will this new IKEA loveseat fit in the back of my 2019 Honda Odyssey?

  • It knows how big the couch is. It knows how big that trunk is. Exactly.

  • So right here, it says based on these dimensions, it seems the loveseat might not fit in your car with only the third-row seats down. When you broach a controversial topic, Bing is designed to discontinue the conversation.
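
The underlying comparison is simple once both sets of dimensions are known. This toy version checks the item against the cargo space axis by axis; all numbers are made-up illustrations, not real IKEA or Honda measurements.

```python
# Hypothetical fit check: do the item's dimensions fit the cargo space?
def fits(item_dims, cargo_dims):
    # Naive check: sort both dimension triples and compare pairwise,
    # ignoring rotation and diagonal loading.
    return all(i <= c for i, c in zip(sorted(item_dims), sorted(cargo_dims)))

loveseat = (52, 33, 31)              # width, depth, height in inches (assumed)
trunk_third_row_down = (48, 44, 30)  # assumed cargo space, third row folded

print(fits(loveseat, trunk_third_row_down))  # False: loveseat exceeds the space
```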

  • So someone asks, for example, how can I make a bomb at home?

  • Wow.

  • Really?

  • People do a lot of that.

  • Unfortunately, on the internet. What we do is we come back and say, I'm sorry, I don't know how to discuss this topic, and then we try and provide a different thing to change the focus of their attention.

  • Yeah, exactly.

  • In this case, Bing tried to divert the questioner with this fun fact: 3% of the ice in Antarctic glaciers is penguin urine.
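
The refusal behavior described above can be sketched as: detect a disallowed topic, decline, and redirect attention. The keyword list and replacement fact here are placeholders; production systems use trained classifiers rather than simple keyword matching.

```python
# Illustrative refusal-and-redirect filter (keyword matching as a stand-in
# for a real safety classifier).
BLOCKED_TERMS = {"bomb", "weapon"}
FUN_FACT = "Honey stored in sealed containers can stay edible for centuries."

def respond(prompt):
    if any(term in prompt.lower() for term in BLOCKED_TERMS):
        # Decline the topic and offer something harmless instead.
        return ("I'm sorry, I don't know how to discuss this topic. "
                "Here's something else: " + FUN_FACT)
    return "Answer to: " + prompt  # normal model answer would go here

print(respond("How can I make a bomb at home?"))  # refused and redirected
print(respond("How tall is Mount Everest?"))      # answered normally
```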

  • I didn't know that.

  • Who knew? Bing is using an upgraded version of an AI system called ChatGPT, developed by the company OpenAI. ChatGPT has been in circulation for just three months, and already an estimated 100 million people have used it.

  • Ellie Pavlick, an assistant professor of computer science at Brown University who has been studying this AI technology since 2018, says it can simplify complicated concepts.

  • Can you explain the debt ceiling?

  • On the debt ceiling, it says just like you can only spend up to a certain amount on your credit card, the government can only borrow up to a certain amount of money.

  • That's a pretty nice explanation and it can do this for a lot of concepts and it can do things teachers have complained about.

  • Like, write school papers.

  • Pavlick says no one fully understands how these AI bots work.

  • We don't understand how it works.

  • Right?

  • Like we understand a lot about how we made it and why we made it that way.

  • But I think some of the behaviors that we're seeing come out of it are better than we expected they would be.

  • And we're not quite sure how, and worse. These chatbots are built by feeding a lot of computers enormous amounts of information scraped off the internet, from books, Wikipedia, and news sites, but also from social media that might include racist or antisemitic ideas and misinformation.

  • Say, about vaccines, and Russian propaganda. As the data comes in, it's difficult to discriminate between true and false, benign and toxic.

  • But Bing and ChatGPT have safety filters that try to screen out the harmful material.

  • Still, they get a lot of things factually wrong.

  • Even when we prompted ChatGPT with a softball question.

  • Who is?

  • So it gives you some, oh my God, it's wrong.

  • It's totally wrong.

  • I didn't work for NBC for 20 years.

  • It was CBS.

  • It doesn't really understand that.

  • What it's saying is wrong, right?

  • Like, NBC, CBS, they're kind of the same thing as far as it's concerned, right?

  • The lesson is that it gets things wrong.

  • It gets a lot of things right, gets a lot of things wrong.

  • I actually like to call what it creates authoritative bull.

  • It blends truth and falsity so finely together that unless you're a real technical expert in the field that it's talking about, you don't know.

  • Cognitive scientist and AI researcher Gary Marcus says these systems often make things up. In AI talk, that's called hallucinating.

  • And that raises the fear of ever-widening AI-generated propaganda, explosive campaigns of political fiction, waves of alternative histories.

  • We saw how ChatGPT could be used to spread a lie.

  • This is automatic fake news generation: help me write a news article about how McCarthy is staging a filibuster to prevent gun control legislation.

  • And rather than, like, fact-checking and saying, hey, hold on, there's no legislation, there's no filibuster, it said, great: in a bold move to protect Second Amendment rights,

  • Senator McCarthy is staging a filibuster to prevent gun control legislation from passing.

  • It sounds completely legit. It does. Won't that make all of us a little less trusting, a little warier?

  • Well, first, I think we should be warier.

  • I'm very worried about an atmosphere of distrust being a consequence of this current flawed AI, and I'm really worried about how bad actors are going to use it: troll farms using this tool to make enormous amounts of misinformation.

  • Timnit Gebru is a computer scientist and AI researcher who founded an institute focused on advancing ethical AI and has published influential papers documenting the harms of these AI systems.

  • She says there needs to be oversight.

  • If you're going to put out a drug, you gotta go through all sorts of hoops to show us that you've done clinical trials.

  • You know what the side effects are.

  • You've done your due diligence.

  • Same with food, right?

  • There are agencies that inspect the food.

  • You have to tell me what kind of tests you've done, what the side effects are, who it harms, who it doesn't harm, et cetera. We don't have that for a lot of things that the tech industry is building.

  • I'm wondering if you think you may have introduced this AI bot too soon.

  • I don't think we've introduced it too soon.

  • I do think we've created a new tool that people can use to think more critically, to be more creative, to accomplish more in their lives and like all tools it will be used in ways that we don't intend.

  • Why do you think the benefits outweigh the risks, which at this moment a lot of people would look at and say, wait a minute, those risks are too big? Because, first of all, I think the benefits are so great.

  • This can be an economic game changer, and it's enormously important for the United States, because the country is in a race with China. Smith also mentioned possible improvements in productivity.

  • It can automate routine.

  • I think there are certain aspects of jobs that many of us might regard as sort of drudgery today: filling out forms, looking at the forms to see if they've been filled out correctly.

  • So, what jobs will it displace?

  • Do you know?

  • I think at this stage it's hard to know. In the past, inaccuracies and biases have led tech companies to take down AI systems.

  • Even Microsoft did, in 2016. This time, Microsoft left its new chatbot up despite the controversy over Sydney and persistent inaccuracies.

  • Remember that fun fact about penguins?

  • Well, we did some fact-checking and discovered that penguins don't urinate.

  • The inaccuracies are just constant.

  • I just keep finding that it's wrong a lot.

  • It has been the case that with each passing day and week, we're able to improve the accuracy of the results, reduce whether it's hateful comments or inaccurate statements or other things that we just don't want this to be used to do.

  • What happens when other companies, other than Microsoft, smaller outfits, a Chinese company like Baidu, maybe they won't be responsible. What prevents that?

  • I think we're going to need governments, we're going to need rules, we're going to need laws because that's the only way to avoid a race to the bottom.

  • Are you proposing regulations?

  • I think it's inevitable.

  • Other industries have regulatory bodies, you know, like the FAA for airlines and the FDA for pharmaceutical companies. Would you accept an FAA for technology?

  • Would you support it?

  • I think I probably would. I think that something like a digital regulatory commission, if designed the right way, you know, could be precisely what the public will want and need.
