Transcript
1
00:00:00,360 --> 00:00:02,305
Speaker 1: Welcome to the
Healthy, Wealthy and Wise
2
00:00:02,444 --> 00:00:07,442
podcast with Dr William Choctaw,
MD. JD. In this episode, Dr.
3
00:00:07,482 --> 00:00:10,807
Choctaw will continue his
Leadership Masterclass series by
4
00:00:10,807 --> 00:00:16,202
covering how AI chatbots
improve healthcare. One of the
5
00:00:16,242 --> 00:00:20,312
many ways AI has manifested in
healthcare is through chatbots,
6
00:00:20,881 --> 00:00:24,628
which are computer programs
designed to engage with human
7
00:00:24,788 --> 00:00:29,301
users in natural language
conversations. Dr Choctaw will
8
00:00:29,361 --> 00:00:34,929
dive into this aspect, as well
as share AI chatbots' ability to
9
00:00:34,929 --> 00:00:39,121
directly assist the medical
community in various ways. Dr
10
00:00:39,161 --> 00:00:43,246
Choctaw will also take a look at
some of the tech companies that
11
00:00:43,246 --> 00:00:47,173
are major players in this space,
with large investments and
12
00:00:47,232 --> 00:00:52,045
chatbots of their own. Dr
Choctaw will address the cheers
13
00:00:52,365 --> 00:00:55,832
and the fears regarding this
transformative technology.
14
00:00:56,780 --> 00:01:01,331
Question. Do you know what a
DaVinci robot is or what it does?
15
00:01:01,331 --> 00:01:04,906
Well, these and other
questions will be answered and
16
00:01:04,947 --> 00:01:08,763
addressed in this podcast
episode. So let's get started.
17
00:01:09,584 --> 00:01:12,432
Here's Dr William Choctaw, MD, JD.
18
00:01:13,561 --> 00:01:16,924
Speaker 2: We're going to talk
about how artificially
19
00:01:17,045 --> 00:01:29,465
intelligent chatbots improve
healthcare. So buckle up, stay
20
00:01:29,504 --> 00:01:34,463
with me and here we go. I
believe that life is about being
21
00:01:34,504 --> 00:01:39,421
of service to others. I believe
knowledge is power. I believe
22
00:01:39,522 --> 00:01:43,688
leaders can change the world. As
always, when we do the
23
00:01:43,707 --> 00:01:48,900
masterclass, we talk about
medicine specifically and then,
24
00:01:48,980 --> 00:01:53,421
on top of that, we'll talk about
law or legal issues. Today,
25
00:01:53,441 --> 00:01:56,126
we're going to talk about
medicine as it interacts with
26
00:01:56,266 --> 00:02:01,524
artificial intelligence and
finances, and we'll delineate
27
00:02:01,563 --> 00:02:06,222
that more as we go along. So
stay with us. As always, I like
28
00:02:06,263 --> 00:02:08,929
to include an outline. The
purpose of the outline is to let
29
00:02:08,929 --> 00:02:12,043
you know what I'm going to talk
about. I'm going to talk about
30
00:02:13,165 --> 00:02:15,610
artificial intelligence
specifically. We're going to
31
00:02:15,651 --> 00:02:18,544
define it. We believe in
definitions. We're going to
32
00:02:18,585 --> 00:02:23,022
define what a chatbot is. We're
going to define some of the
33
00:02:23,082 --> 00:02:28,241
different iterations of AI and
the chatbot. We're going to
34
00:02:28,281 --> 00:02:31,645
go over reasons for concern. We're
going to talk about some of the
35
00:02:31,645 --> 00:02:35,072
problems and fears with
chatbots that individuals are
36
00:02:35,111 --> 00:02:39,243
going through presently, and
we're going to talk about robots
37
00:02:39,243 --> 00:02:43,301
and surgery. Yes, there are
robots in surgery, don't worry.
38
00:02:43,361 --> 00:02:48,361
We'll explain it when we get
there. I don't want to scare you
39
00:02:48,361 --> 00:02:52,591
. Don't want to scare you, but they are
there. But that's a good thing.
40
00:02:52,711 --> 00:02:58,588
That's a good thing. Okay, so
what is artificial intelligence?
41
00:02:58,588 --> 00:03:01,735
I know you've heard the term
for many, many years and I know
42
00:03:01,775 --> 00:03:05,100
that you have an idea about what
artificial intelligence is. But
43
00:03:05,100 --> 00:03:08,810
what do we mean when we say
artificial intelligence? Let me
44
00:03:08,849 --> 00:03:13,787
give you a definition: the
simulation or approximation of
45
00:03:13,927 --> 00:03:21,493
human intelligence in machines.
Let me give you my definition:
46
00:03:21,554 --> 00:03:25,725
Teaching machines to think.
Artificial intelligence is
47
00:03:25,746 --> 00:03:31,949
teaching machines to think. Now
you can either be excited about
48
00:03:31,969 --> 00:03:37,046
that or that could scare you,
and I submit that it's probably
49
00:03:37,067 --> 00:03:39,514
a little of both. You know that
one goes through. I know it's
50
00:03:39,554 --> 00:03:45,883
probably a little of both that I
go through, but this is what
51
00:03:45,943 --> 00:03:50,328
artificial intelligence is, and
I can tell you that if you're
52
00:03:50,407 --> 00:03:54,352
finding out or listening now
about artificial intelligence,
53
00:03:54,853 --> 00:04:01,282
it's already been around for probably
at least the last five or 10 years. So
54
00:04:01,302 --> 00:04:05,268
this is Dr Choctaw, brain
surgeon, heart surgeon, finding it
55
00:04:05,268 --> 00:04:09,014
both exciting and a little
traumatizing at the same time.
56
00:04:10,620 --> 00:04:14,706
What is the goal of artificial
intelligence? Well, the goal is
57
00:04:14,826 --> 00:04:19,880
computer enhanced learning,
reasoning and perception. Think
58
00:04:19,920 --> 00:04:25,411
about that: computer-enhanced
learning, reasoning and
59
00:04:25,492 --> 00:04:30,528
perception. Think about a
computer that can reason. Think
60
00:04:30,567 --> 00:04:36,345
about a computer that has
perception. Okay, and I remember
61
00:04:36,345 --> 00:04:40,372
we always say perception is
reality. You know, certainly in
62
00:04:40,432 --> 00:04:43,624
politics that's true. Whatever
people perceive you are
63
00:04:43,704 --> 00:04:46,468
politically and in terms of how
they're going to vote, is
64
00:04:46,509 --> 00:04:49,940
probably how you really are from
their perspective. But think
65
00:04:49,980 --> 00:04:53,600
about that. There are types of
artificial intelligence. I'm
66
00:04:53,620 --> 00:04:56,827
just going to give you the two
broad categories. The first is basic
67
00:04:56,928 --> 00:05:01,564
pattern recognition. Now, basic
pattern recognition is where you
68
00:05:01,564 --> 00:05:07,273
give a computer or machine a
certain set of data and you keep
69
00:05:07,273 --> 00:05:10,197
giving it the same amount over
and over and over again, and it
70
00:05:10,298 --> 00:05:13,564
learns by repetition. It keeps
repeating it and repeating it
71
00:05:13,584 --> 00:05:17,120
and repeating it until it has it
memorized. And then there's the
72
00:05:17,120 --> 00:05:20,906
other part of artificial
intelligence, or thinking
73
00:05:20,987 --> 00:05:27,182
machines, if you will: complex
human emotion. And I submit that
74
00:05:27,182 --> 00:05:30,949
this is where we get into the
discomfort that we humans have,
75
00:05:31,029 --> 00:05:34,242
and I would argue that we should
have, and we'll talk about this
76
00:05:34,242 --> 00:05:38,961
a little bit later in the
presentation. What is an
77
00:05:39,021 --> 00:05:44,091
emotional machine? Is that an
oxymoron? What does that mean? A
78
00:05:44,091 --> 00:05:48,206
machine with emotion? Think
about that. Just let that
79
00:05:48,365 --> 00:05:54,624
marinate for a bit. One of the
ways that scientists who do this
80
00:05:54,624 --> 00:05:57,608
work, computer scientists who
do this work, are able to get
81
00:05:57,689 --> 00:06:01,454
machines to do what they do is
they use something called
82
00:06:01,613 --> 00:06:05,850
algorithms. Those of you in math
and those of you in engineering
83
00:06:05,850 --> 00:06:08,480
and data management, you're
very familiar with algorithms,
84
00:06:08,940 --> 00:06:13,226
but basically algorithms are
rules that data is given to make
85
00:06:13,226 --> 00:06:17,413
it, in my estimation, in my
opinion, form a particular
86
00:06:17,473 --> 00:06:22,218
pattern to get something done.
So this is all developed ahead
87
00:06:22,257 --> 00:06:26,303
of time. But so the broad
umbrella of what we're talking
88
00:06:26,343 --> 00:06:29,949
about when we talk about AI
chatbots (AI stands for
89
00:06:30,088 --> 00:06:34,634
artificial intelligence), the
broad umbrella, is
90
00:06:34,834 --> 00:06:40,007
artificial intelligence. My
definition: thinking machines, or
91
00:06:40,067 --> 00:06:44,607
teaching machines how to think.
Well, what in the world is a
92
00:06:44,706 --> 00:06:51,548
chatbot? C-H-A-T-B-O-T. Well,
obviously it's a made-up word:
93
00:06:51,548 --> 00:06:57,541
chat means to talk or to speak,
and bot is like robot. Okay, so
94
00:06:57,562 --> 00:07:04,680
think of a talking robot as a
chat bot. Okay, the official
95
00:07:04,721 --> 00:07:08,045
definition is a chat bot is a
program that communicates with
96
00:07:08,125 --> 00:07:14,033
you, most commonly using a
text interface and artificial
97
00:07:14,132 --> 00:07:20,168
intelligence. When you get in
your car and you use your GPS.
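To make that definition concrete, here is a minimal sketch of a rule-based chatbot in Python. It is only an illustration, not any particular product; the keywords and canned replies are invented for the example.

    # Minimal rule-based chatbot: match keywords in the user's text
    # and answer from a fixed table of replies.
    RULES = {
        "appointment": "I can help you schedule an appointment. What day works for you?",
        "refill": "I can send a refill request to your pharmacy.",
        "hours": "The clinic is open 8 a.m. to 5 p.m., Monday through Friday.",
    }

    def reply(message: str) -> str:
        text = message.lower()
        for keyword, answer in RULES.items():
            if keyword in text:
                return answer
        return "I'm sorry, I didn't catch that. Could you rephrase it?"

    if __name__ == "__main__":
        print(reply("Can I get a refill of my medication?"))

A modern chatbot of the kind discussed here replaces the fixed keyword table with a large language model, but the basic loop of reading text and returning text is the same.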
98
00:07:22,540 --> 00:07:33,701
Your GPS is not human. I don't
mean to burst your bubble. That GPS
99
00:07:33,740 --> 00:07:39,725
is not a human. It is a chat bot
of a type. You know it actually
100
00:07:39,766 --> 00:07:43,901
is a very early, rudimentary
chat bot. You know, it's not
101
00:07:43,920 --> 00:07:46,148
even at the higher levels that
we're going to talk about today,
102
00:07:46,148 --> 00:07:50,867
but that's what a chat bot is.
So how does that work in terms
103
00:07:50,927 --> 00:07:55,785
of where we are today? I would
suggest to you that this
104
00:07:55,904 --> 00:08:00,072
information is in the public
domain in the last six months.
105
00:08:00,560 --> 00:08:03,826
So this is not stuff that the
public has found out about
106
00:08:04,047 --> 00:08:06,833
you know, five or 10
years ago, even though it's been
107
00:08:06,833 --> 00:08:10,745
around. But we're learning
about it in the last six months
108
00:08:10,807 --> 00:08:15,641
or so, and we'll go with that.
Remember when I talked about
109
00:08:15,721 --> 00:08:22,382
your handheld computer, i.e.
your cell phone? It's also a
110
00:08:22,382 --> 00:08:26,261
computer and we already talked
about this. But let me sort of
111
00:08:26,420 --> 00:08:33,563
use that as an example to talk
about the chat bot. Okay, first
112
00:08:33,624 --> 00:08:42,240
a bit of history. The first
big company to get behind a chat
113
00:08:42,341 --> 00:08:47,616
bot is Microsoft. Now, obviously
, many of you are familiar with
114
00:08:47,657 --> 00:08:51,312
Microsoft. If you have a computer,
you may be using Microsoft
115
00:08:51,331 --> 00:08:58,177
software in your computer. I do,
I'm old school. As a matter of
116
00:08:58,217 --> 00:09:01,833
fact, these presentations are
done with a product of Microsoft
117
00:09:01,833 --> 00:09:06,498
called PowerPoint. I've used
PowerPoint for 30-plus
118
00:09:06,538 --> 00:09:14,399
years. Microsoft is a very big,
worldwide computer operating
119
00:09:14,440 --> 00:09:19,578
system company. Microsoft basically
heard about a company called
120
00:09:20,019 --> 00:09:26,013
OpenAI. Microsoft then decided
and we'll get into this later to
121
00:09:26,013 --> 00:09:30,739
work with this company. The
very earliest form of a chat bot
122
00:09:30,739 --> 00:09:38,158
is something that's called
ChatGPT. This is backed by Microsoft.
123
00:09:38,158 --> 00:09:44,176
I'm just going to deal with the
big players with computers, or
124
00:09:44,235 --> 00:09:48,177
talking computers, if you will.
Let me just go over the list.
125
00:09:48,510 --> 00:09:53,216
Microsoft is the first, Google
is second, meta or Facebook is
126
00:09:53,297 --> 00:09:59,601
third and Baidu, which is the
Microsoft, if you will, of China.
127
00:09:59,601 --> 00:10:05,856
It's the last one, but
the Microsoft-backed product is called
128
00:10:05,998 --> 00:10:11,860
ChatGPT. GPT stands for
Generative Pre-trained Transformer.
129
00:10:11,860 --> 00:10:15,254
It doesn't need to mean anything
more than that to us, but this is
130
00:10:15,274 --> 00:10:18,816
their first product. This
product rolled out to the public
131
00:10:18,816 --> 00:10:23,650
in 2022, last year. We're not
talking about a long time ago. And it's
132
00:10:23,671 --> 00:10:29,153
free. Basically, it's free. It's
called ChatGPT. Indeed, if you
133
00:10:29,254 --> 00:10:33,011
have a Microsoft computer, it's
already on your computer. It's
134
00:10:33,091 --> 00:10:36,772
already there. If you don't know
where to look, go to your
135
00:10:36,812 --> 00:10:41,671
browser. Go to where you search
and type in ChatGPT, and then
136
00:10:41,711 --> 00:10:46,537
it'll pop up. Don't be scared,
it'll pop up on your computer.
137
00:10:49,975 --> 00:10:59,452
The very basic type of chatbot
is named ChatGPT. ChatGPT is the
138
00:10:59,592 --> 00:11:05,592
name of the chatbot that
Microsoft backs. This is called
139
00:11:05,633 --> 00:11:12,336
ChatGPT, or GPT-3. Now
Microsoft has already, within
140
00:11:12,778 --> 00:11:16,335
months, come out with an
advanced version of that called
141
00:11:16,676 --> 00:11:25,374
GPT-4. GPT-4 you have to pay for.
GPT-3 is free. GPT-4 you have to
142
00:11:25,414 --> 00:11:29,179
pay for; I think it's like $40 a
month or something like that.
143
00:11:30,471 --> 00:11:35,615
But if you have Bing, this is
another product, a search engine, from
144
00:11:35,735 --> 00:11:39,312
Microsoft. If you have Bing on
your computer, you already have
145
00:11:39,373 --> 00:11:46,677
GPT-4 for free. Let me go over
that again. GPT-3 is the early
146
00:11:48,039 --> 00:11:53,115
example of the chatbot that was
backed by Microsoft, which is the
147
00:11:53,177 --> 00:11:56,072
leading company because it's out
first with this technology for
148
00:11:56,091 --> 00:12:03,077
the public. The advanced one is GPT-4,
but you already have GPT-4 on
149
00:12:03,097 --> 00:12:07,979
your computer for free. Google
has a product called Bard, with
150
00:12:08,019 --> 00:12:12,530
PaLM 2. They're already coming
out with their product, but
151
00:12:12,551 --> 00:12:15,839
they're a little behind
Microsoft. Then Facebook Meta
152
00:12:15,879 --> 00:12:19,912
has a product called BlenderBot,
and Baidu, the Chinese
153
00:12:21,758 --> 00:12:25,052
tech company, has one called
Ernie, which I found
154
00:12:25,173 --> 00:12:30,053
interesting. I'm sure there's a
message there, but I just don't
155
00:12:30,094 --> 00:12:35,431
understand it. Now just make
sure that you understand what I
156
00:12:35,471 --> 00:12:40,379
mean by Microsoft. You are
familiar with Microsoft products.
157
00:12:40,379 --> 00:12:44,032
You'll either use Word when
you type a letter or type
158
00:12:44,052 --> 00:12:49,072
a paper. You may use Excel for
numbers if you do any type of
159
00:12:49,193 --> 00:12:53,794
number accumulation or
accounting. As I mentioned about
160
00:12:53,794 --> 00:12:57,991
PowerPoint, these are the
products of Microsoft. They call it
161
00:12:57,991 --> 00:13:06,942
Microsoft Office. The company
that started ChatGPT is
162
00:13:07,022 --> 00:13:11,871
called OpenAI. That's at the top.
This is a small company, I
163
00:13:11,912 --> 00:13:14,937
think. Right now I think its
main office is in San Francisco,
164
00:13:15,149 --> 00:13:24,321
California. A small company. It's
the one that started ChatGPT.
165
00:13:25,370 --> 00:13:28,192
Now, I think you'll be
interested in the history of how
166
00:13:28,192 --> 00:13:33,859
this came about. OpenAI is the
name of the company that started
167
00:13:33,859 --> 00:13:38,774
ChatGPT. It was founded in
2015, not too long ago.
168
00:13:39,950 --> 00:13:45,120
Microsoft, being the gentle
giant that it is in the tech
169
00:13:45,182 --> 00:13:49,951
industry, is always looking,
obviously worldwide, for new
170
00:13:50,030 --> 00:13:54,793
technology and obviously
potential competition. Microsoft
171
00:13:54,793 --> 00:14:00,700
approached OpenAI, the company
founded in 2015 in San Francisco,
172
00:14:00,700 --> 00:14:05,918
and said, we like your product,
we want to invest in your
173
00:14:05,957 --> 00:14:11,895
product. We're going to invest
one billion, with a B, dollars,
174
00:14:11,895 --> 00:14:17,120
into your company. Of course,
you're a small startup tech company
175
00:14:17,490 --> 00:14:20,894
and Microsoft comes in and says
they want to invest a billion
176
00:14:20,933 --> 00:14:29,241
dollars. What do you do? You
say sure, bring it, I'll take
177
00:14:29,322 --> 00:14:36,599
it. Stay with me. Now, four
years later, 2023, this year,
178
00:14:37,510 --> 00:14:43,011
Microsoft says we're even more
impressed with your company,
179
00:14:43,172 --> 00:14:49,898
OpenAI, and your product,
ChatGPT. We want to increase our
180
00:14:49,957 --> 00:14:54,653
investment by an additional 10
billion. Stay with me now. This
181
00:14:54,693 --> 00:14:58,836
is 11 billion dollars from
Microsoft to a little company in
182
00:14:58,836 --> 00:15:02,618
San Francisco. You may say, well,
why in the world is Microsoft
183
00:15:02,658 --> 00:15:05,711
doing that? Well, obviously, if
you're a worldwide company and
184
00:15:05,792 --> 00:15:09,292
you've got information
throughout the world and there
185
00:15:09,312 --> 00:15:12,833
was something Microsoft saw that
it liked, and that's that
186
00:15:12,894 --> 00:15:16,859
bullet point. The next bullet
point is that in December alone of
187
00:15:16,958 --> 00:15:23,361
last year, a million people
signed up with ChatGPT. A
188
00:15:23,422 --> 00:15:27,686
million people in one month.
Okay, you think that was
189
00:15:27,725 --> 00:15:32,846
something. In January, the next
month, over a hundred million
190
00:15:32,886 --> 00:15:36,984
people signed up for the same
product. So what do you think
191
00:15:37,004 --> 00:15:42,102
Microsoft is thinking now? We
got a good one here, okay. So
192
00:15:42,143 --> 00:15:46,610
what's my point? My point is to
show you the speed of this stuff.
193
00:15:46,610 --> 00:15:49,849
This is not stuff that happens
in six months and then another
194
00:15:49,908 --> 00:15:53,760
three months. This is happening
daily, weekly, monthly. The
195
00:15:53,821 --> 00:15:58,044
speed goes along with
computers. What is unique about
196
00:15:58,085 --> 00:16:02,461
computers compared to humans?
One of the things is speed. Okay,
197
00:16:02,461 --> 00:16:06,264
a computer can do what I
can do. The computer can do it
198
00:16:06,323 --> 00:16:09,808
in a second. A second is when
you say one, one thousand,
199
00:16:10,260 --> 00:16:13,065
that's a second, that's one
second. One, one thousand, two,
200
00:16:13,144 --> 00:16:15,265
one thousand, three, one
thousand, four, one thousand, five,
201
00:16:15,265 --> 00:16:18,482
one thousand. Five seconds just
went by. The computer is
202
00:16:18,543 --> 00:16:23,684
already done by then. So this is
the era that we're in. Now let
203
00:16:23,705 --> 00:16:29,322
me put this in context. I'm a
baby boomer, I'm 75 years old. I
204
00:16:29,322 --> 00:16:35,462
still remember rotary
telephones. I remember there
205
00:16:35,482 --> 00:16:40,524
were no cell phones in my
neighborhood. So I'm being
206
00:16:40,585 --> 00:16:45,404
transported now to a whole new
era. And, to be honest with you,
207
00:16:45,404 --> 00:16:51,043
I find that exciting. I said
cool, bring it, bring it, okay.
208
00:16:51,605 --> 00:16:54,320
So let's go into detail a little
more about chatbots, and
209
00:16:54,340 --> 00:16:57,802
particularly as it has to do
with healthcare, and I'll just
210
00:16:57,841 --> 00:17:00,663
go through this relatively
quickly. Well, how does this
211
00:17:01,205 --> 00:17:07,307
talking computer, this
thinking machine, help in a
212
00:17:07,347 --> 00:17:10,498
hospital, whatever, whatever.
Well, ironically it helps in a
213
00:17:10,573 --> 00:17:15,021
number of ways. And I will submit
to you that if you get
214
00:17:15,041 --> 00:17:18,420
your healthcare in a large
healthcare delivery system, and
215
00:17:18,440 --> 00:17:22,201
I'm talking about a large
university related healthcare
216
00:17:22,221 --> 00:17:25,800
delivery system or Kaiser or
something like that, you are
217
00:17:25,881 --> 00:17:28,824
already interacting with these
bots and you just don't know it.
218
00:17:28,824 --> 00:17:33,481
You think you're dealing with
humans, you're not, and, believe
219
00:17:33,481 --> 00:17:36,343
it or not, some of that is a
good thing, because the robot
220
00:17:36,383 --> 00:17:39,881
just keeps doing what it does,
it doesn't get tired, it doesn't
221
00:17:39,881 --> 00:17:43,144
get frustrated, and it just keeps
doing it and doing it, and
222
00:17:43,185 --> 00:17:48,343
doing it. So, five ways that
it works in healthcare. First,
223
00:17:48,644 --> 00:17:51,544
providing informational support.
Clearly, one of the most
224
00:17:51,584 --> 00:17:54,061
frustrating things about
being a patient, or having a family
225
00:17:54,061 --> 00:17:56,903
member who's a patient is
getting information. What's
226
00:17:56,943 --> 00:17:59,369
going on, why are you doing this,
why are you doing that, what
227
00:17:59,410 --> 00:18:02,183
does this mean, What did that
show? Et cetera, et cetera, et
228
00:18:02,223 --> 00:18:08,065
cetera. Second, scheduling
appointments. Now, when I want
229
00:18:08,085 --> 00:18:13,362
to see my doctor or whatever, I
go to my computer, I pull down
230
00:18:13,382 --> 00:18:18,566
the website of my healthcare
delivery system (I happen to have
231
00:18:18,566 --> 00:18:23,469
Kaiser) and I just dial in. I
want to see blah, blah, blah for
232
00:18:23,469 --> 00:18:28,306
X. And I do it all online, is my
point. I never really talk to
233
00:18:28,326 --> 00:18:34,384
a human, and I sort of like that,
quite honestly. Third, collecting
234
00:18:34,424 --> 00:18:38,689
patient information. Obviously,
when you sign into the hospital
235
00:18:38,839 --> 00:18:42,247
or you go into a clinic, they
have you fill out a lot of stuff
236
00:18:42,247 --> 00:18:46,323
And all that stuff goes into a
computer And that computer then
237
00:18:46,363 --> 00:18:49,903
becomes more knowledgeable about
you. Fourth, providing medical
238
00:18:49,963 --> 00:18:53,866
assistance. When the physician
wants to know more and more
239
00:18:53,926 --> 00:18:58,121
information, he goes to that
data bank and can get it. And
240
00:18:58,181 --> 00:19:03,009
fifth, assisting in refilling. If
you take a lot of medication
241
00:19:03,740 --> 00:19:07,663
and you need to refill your
medication many times, you can
242
00:19:07,703 --> 00:19:13,003
do that online, and that process
is facilitated by an artificially
243
00:19:13,184 --> 00:19:18,387
intelligent machine, by an
artificially intelligent machine.
244
00:19:19,500 --> 00:19:22,520
Well, how do these chatbot
things help doctors? How do
245
00:19:22,540 --> 00:19:27,881
these computers help doctors?
And let me just add here if you
246
00:19:27,921 --> 00:19:32,726
remember back to some of the
previous masterclass lectures,
247
00:19:33,160 --> 00:19:37,461
we talked about the Affordable
Care Act, Obamacare, that was
248
00:19:37,541 --> 00:19:41,924
passed in 2010, 13 years ago,
and we mentioned that one of the
249
00:19:41,924 --> 00:19:46,561
main factors of Obamacare was
it forced healthcare to go from paper
250
00:19:47,464 --> 00:19:52,328
to computers. This is a part of
that process that started 13
251
00:19:52,388 --> 00:19:57,921
years ago. So how do we
physicians use the chatbot?
252
00:19:57,921 --> 00:20:00,566
Well, it helps us with clinical
decision support. What does that
253
00:20:00,566 --> 00:20:05,109
mean? That means, let's say,
I admit my patient to hospital
254
00:20:05,310 --> 00:20:12,169
A, and the patient is very sick, and
it's a condition that you don't
255
00:20:12,229 --> 00:20:17,564
see very often, let's put it in
that category. And so I think
256
00:20:17,604 --> 00:20:21,487
that I wanna do treatment A. And
so I type treatment A into the
257
00:20:21,507 --> 00:20:28,361
computer And the computer gives
me a red X or something, and I
258
00:20:28,381 --> 00:20:34,070
said whoa, what's that? And the
computer says that it will not
259
00:20:34,131 --> 00:20:38,183
accept my treatment
recommendation. And then I asked
260
00:20:38,183 --> 00:20:42,700
why? And the computer says I
need to talk to X or Y or Z. My
261
00:20:42,760 --> 00:20:46,781
point is it is helping me with
my decision regarding the
262
00:20:47,103 --> 00:20:50,241
treatment for that patient.
That's a good thing. Medicine
263
00:20:50,301 --> 00:20:54,359
changes every single day And it
is impossible for anybody I
264
00:20:54,380 --> 00:20:57,540
don't care how smart you are or
what school you went to to stay
265
00:20:57,641 --> 00:21:01,221
up with everything that's
current. On and on and on. Think
266
00:21:01,221 --> 00:21:05,604
about an artificially intelligent
chatbot computer and think
267
00:21:05,644 --> 00:21:10,862
about it in terms of a library,
a world library. Think about a
268
00:21:10,922 --> 00:21:14,144
library, but it's a world
library. And think about this
269
00:21:14,203 --> 00:21:18,409
world library that has every
single book in the world in its
270
00:21:18,568 --> 00:21:22,980
library, and think about all the
data and information in that
271
00:21:23,101 --> 00:21:28,027
world library that you can
download to one computer system.
272
00:21:28,027 --> 00:21:30,781
And that's what we're talking
about when we're talking about
273
00:21:31,123 --> 00:21:35,143
an artificially intelligent
chatbot. So the computer just
274
00:21:35,203 --> 00:21:38,486
goes to this database and says,
no, you need to do A, B and C.
275
00:21:39,200 --> 00:21:41,627
The same thing goes for
literature search and procedure
276
00:21:41,688 --> 00:21:45,942
review. One of the very
interesting areas that hospitals
277
00:21:45,942 --> 00:21:49,281
particularly like and doctors
also, is that you can get a lot
278
00:21:49,321 --> 00:21:50,705
of good feedback from patients.
279
00:21:52,871 --> 00:21:53,372
Speaker 4: Dr Choctaw
280
00:21:55,546 --> 00:21:57,119
Speaker 2: Quickly, let's drill
down a little bit more about
281
00:21:57,161 --> 00:22:00,864
this artificial intelligence
stuff, and particularly as it
282
00:22:00,903 --> 00:22:04,565
has to do with machine learning.
Now on the left side you see
283
00:22:06,009 --> 00:22:09,042
machine learning and deep
learning again. Machine learning
284
00:22:09,042 --> 00:22:14,181
here is that repetition type of
learning, more superficial. Deep
285
00:22:14,181 --> 00:22:20,002
learning is where you get into
perceptions and reasoning and
286
00:22:20,042 --> 00:22:23,503
that sort of thing. Go down to
the lower level, you get natural
287
00:22:23,503 --> 00:22:28,009
language processing. This is
how multiple languages around
288
00:22:28,028 --> 00:22:30,982
the world have to be
processed. So it means the same
289
00:22:31,022 --> 00:22:37,324
thing for everybody. Predictive
analytics: CEOs of companies
290
00:22:37,585 --> 00:22:43,726
love predictive analytics. Why?
Let's say you're the CEO of
291
00:22:43,786 --> 00:22:49,505
hospital A. Hospital A has 500
beds. You have 2,000 employees
292
00:22:49,565 --> 00:22:55,832
in hospital A. You treat, let's
say, 5,000 people in six months
293
00:22:55,980 --> 00:23:00,608
in hospital A, but it is
problematic in terms of knowing
294
00:23:01,160 --> 00:23:04,560
how many staff you need on a
Friday or a Sunday, how many
295
00:23:04,641 --> 00:23:07,886
nurses you need, how many
doctors are gonna be on call,
296
00:23:08,319 --> 00:23:10,422
how many emergencies are you
gonna get. So you have to bring
297
00:23:10,482 --> 00:23:14,104
in more of this and that. You
have to guess. You have to guess
298
00:23:14,104 --> 00:23:16,961
And most of the times we guess
wrong. I mean, because we're
299
00:23:16,981 --> 00:23:21,800
human. The computer, the chatbot,
can do that for you, and it
300
00:23:21,861 --> 00:23:25,980
does that for you with what is
called predictive analytics. Go
301
00:23:26,040 --> 00:23:32,047
back to that world library
that's been downloaded to the
302
00:23:32,106 --> 00:23:36,064
chatbot with all the world's
data. It can tell you in
303
00:23:36,144 --> 00:23:41,281
hospital A, looking at your data
for the last year, what's your
304
00:23:41,342 --> 00:23:46,240
staffing is gonna be like on a
Friday night in December. It can
305
00:23:46,240 --> 00:23:50,528
tell you that. It can deduce it
from the data that you've given.
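As a rough illustration of that kind of predictive analytics, here is a minimal sketch in Python. The admission counts, the dates, and the nurse-to-patient ratio are all invented for the example; a real system would draw on the hospital's own records and a far richer model.

    # Minimal predictive-analytics sketch: average last year's
    # admissions by weekday to project staffing needs.
    from collections import defaultdict
    from datetime import date, timedelta
    import random

    random.seed(0)

    # Invented sample data: one year of daily admission counts.
    start = date(2022, 1, 1)
    admissions = {start + timedelta(days=i): random.randint(80, 160) for i in range(365)}

    # Average admissions for each weekday (0 = Monday ... 6 = Sunday).
    totals, counts = defaultdict(int), defaultdict(int)
    for day, n in admissions.items():
        totals[day.weekday()] += n
        counts[day.weekday()] += 1

    NURSES_PER_PATIENT = 1 / 5  # assumed ratio, for illustration only
    friday_avg = totals[4] / counts[4]
    print(f"Expected Friday admissions: {friday_avg:.0f}")
    print(f"Suggested nurses on duty: {friday_avg * NURSES_PER_PATIENT:.0f}")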
306
00:23:50,528 --> 00:23:55,121
That's just extraordinary. So a
CEO and executive team that has
307
00:23:55,121 --> 00:23:59,102
that much information. Again,
as an executive, your job is to
308
00:23:59,143 --> 00:24:02,862
predict the future, right,
because you have to guess: do I
309
00:24:03,564 --> 00:24:07,021
add this many people, or do I
get this much money? Do I take a
310
00:24:07,021 --> 00:24:09,644
loan? blah, blah, blah. The
computer can take all that away
311
00:24:09,704 --> 00:24:17,563
from you. Look at that third box:
sentiment analytics. Sentiment
312
00:24:17,864 --> 00:24:23,221
analytics. Well, guess what
that's talking about? Sentiment
313
00:24:23,323 --> 00:24:28,569
analytics means that it can do a
better job of getting patient
314
00:24:28,670 --> 00:24:32,603
feedback that it can give to the
organization, i.e. hospital,
315
00:24:32,663 --> 00:24:36,701
physician, et cetera in terms of
what patients think, and I
316
00:24:36,740 --> 00:24:40,500
think that's a good thing. I
think that's a good thing. I
317
00:24:40,540 --> 00:24:44,125
think that's a good thing that
doesn't rely on how much time I
318
00:24:44,207 --> 00:24:51,281
have or how relaxed I am or
available I am, if you can put
319
00:24:51,323 --> 00:24:54,064
your thoughts and feelings and
belief into a computerized
320
00:24:54,084 --> 00:24:59,529
system that goes directly to the
people in charge. That's a good
321
00:24:59,529 --> 00:25:05,441
thing. Well, I admit, this is a
cell phone with a, I don't know,
322
00:25:05,500 --> 00:25:10,268
a toy chat bot. I hesitate about
putting this picture in here
323
00:25:10,339 --> 00:25:15,327
because I didn't want to scare
anybody, but I decided to put it
324
00:25:15,327 --> 00:25:20,424
in anyway. All right. Now the
chat bot doesn't necessarily
325
00:25:20,444 --> 00:25:24,789
look like this. This is just, no
doubt, a computer rendition of
326
00:25:24,829 --> 00:25:28,548
what it is, but the point is,
the chat bot is there to help
327
00:25:28,628 --> 00:25:32,549
you. There's no question that
there are issues and concerns,
328
00:25:32,780 --> 00:25:36,806
and we'll talk about that a bit
later But it can be of help to
329
00:25:36,885 --> 00:25:39,708
you And I would encourage you to
look at how it can help you
330
00:25:40,339 --> 00:25:44,901
Very, very quickly. And we'll
just go through this real fast. Again,
331
00:25:44,961 --> 00:25:47,981
hospital administrations like
chatbots, because of some
332
00:25:48,262 --> 00:25:52,887
protocols and HR stuff and all
the regulatory stuff that most
333
00:25:52,928 --> 00:25:56,323
of times is very difficult to
deal with. One of the places
334
00:25:56,363 --> 00:25:58,904
that chat bots have been found
to be helpful is with hospice.
335
00:25:59,960 --> 00:26:03,682
Many times, individuals who
are in hospice are able to
336
00:26:03,741 --> 00:26:08,622
communicate. Think about texting
a friend. Think about texting a
337
00:26:08,622 --> 00:26:14,105
friend who has unlimited time,
limited capability and can
338
00:26:14,165 --> 00:26:18,467
listen to you 24/7. Think
about how therapeutic that might
339
00:26:18,467 --> 00:26:22,800
be if you or your loved one or
your family member happens to
340
00:26:22,981 --> 00:26:28,303
be in a hospice
situation. You want to know this
341
00:26:28,303 --> 00:26:34,586
guy, this guy, his first name
is Sam. So what is Sam doing
342
00:26:34,646 --> 00:26:40,695
here? Sam is actually testifying
in Congress. He started this.
343
00:26:40,736 --> 00:26:44,595
He and his people started,
discovered it. Microsoft bought
344
00:26:44,634 --> 00:26:48,914
into it. It spread worldwide. But
Sam, look at his face; Sam's a
345
00:26:48,954 --> 00:26:55,916
little worried. And so what he's
doing, which is smart, he's
346
00:26:55,936 --> 00:26:59,332
trying to be proactive or
preemptive. And he went to
347
00:26:59,372 --> 00:27:02,839
Congress and said I need to come
and talk to you guys in
348
00:27:02,900 --> 00:27:08,440
government because I've created
something here that you need to
349
00:27:08,500 --> 00:27:12,156
be aware of and you need to
start providing some regulation
350
00:27:12,196 --> 00:27:16,115
with it. You need to start
providing some regulation with
351
00:27:16,175 --> 00:27:22,013
it. Right, it's not 100%
accurate. So my point is that if
352
00:27:22,013 --> 00:27:24,432
you do a query in the chatbot,
it gives you a certain
353
00:27:24,452 --> 00:27:27,596
number of answers. Double check
those answers another way if
354
00:27:27,675 --> 00:27:32,528
possible, and I would argue
nothing is really 100%. Quite
355
00:27:32,548 --> 00:27:38,112
honestly, skepticism is never a
bad thing. Never a bad thing. So
356
00:27:38,112 --> 00:27:41,489
he says that sometimes he's
worried about people using the
357
00:27:41,528 --> 00:27:46,202
chat bot, putting it into a
political forum and then using
358
00:27:46,242 --> 00:27:51,256
that to spread disinformation.
He's worried, legitimately,
359
00:27:51,817 --> 00:27:58,290
about people losing jobs, and
that's obvious, right? I can
360
00:27:58,311 --> 00:28:03,010
remember when the first
mechanical robots came out, they
361
00:28:03,010 --> 00:28:05,789
said well, they're going to get
rid of blue collar jobs. Well,
362
00:28:06,250 --> 00:28:11,902
the chat bot, the AI chat bot is
going to get rid of white
363
00:28:11,922 --> 00:28:20,303
collar jobs: writers, artists,
lawyers, individuals who think
364
00:28:20,864 --> 00:28:25,551
for a living. It can even replace
artists. These chat bots can
365
00:28:25,612 --> 00:28:30,291
draw art, they can create, they
can do everything. So just think
366
00:28:30,291 --> 00:28:33,751
about that on the world stage.
And if you happen to be a writer
367
00:28:33,751 --> 00:28:41,730
for a Harper's company and the
chat bot comes out and says I
368
00:28:41,750 --> 00:28:46,990
can write that 300-page book in
44 hours, right, think about
369
00:28:47,070 --> 00:28:49,759
time and money. And then you
understand the financial
370
00:28:49,878 --> 00:28:54,492
implications. And think about it.
Microsoft is not stupid. They
371
00:28:54,553 --> 00:29:00,233
didn't put $11 billion in this
company because they thought
372
00:29:00,253 --> 00:29:05,058
that they were going to lose
money. But my point is, even
373
00:29:05,140 --> 00:29:09,976
people who are creating the
product are concerned, and Sam
374
00:29:10,057 --> 00:29:14,278
is an example. This is another
guy. His name is Blake Lemoine.
375
00:29:14,971 --> 00:29:19,276
He's a senior engineer or was a
senior engineer for Google. So
376
00:29:19,316 --> 00:29:21,151
remember, they're all doing it.
It's not just Microsoft, all of
377
00:29:21,192 --> 00:29:27,304
them are doing it. And his claim
(get this) was that the chatbot, the
378
00:29:27,304 --> 00:29:32,579
machine, the robot, the
computer had gained sentience.
379
00:29:33,710 --> 00:29:36,417
Now I must admit to you, I
didn't know what sentience was.
380
00:29:37,769 --> 00:29:40,890
I don't remember hearing that
word. So I looked it up And
381
00:29:40,910 --> 00:29:45,680
sentience means the ability to
experience feelings and
382
00:29:45,799 --> 00:29:50,863
sensations. Think about that
with your computer. Do you want
383
00:29:51,104 --> 00:29:57,393
a computer that has feelings and
sensations? Right. And Blake
384
00:29:57,452 --> 00:30:01,561
says that his job was to probe
the chat bot. Ask it questions
385
00:30:01,642 --> 00:30:04,457
24-7. What do you think about
this? What would you do with
386
00:30:04,497 --> 00:30:06,936
that? How would you handle this
scenario? How did you handle
387
00:30:06,957 --> 00:30:11,739
that scenario? And what Blake
has said is that what he learned
388
00:30:12,520 --> 00:30:17,920
disturbed him. It disturbed him
And when he tried to talk to the
389
00:30:17,920 --> 00:30:21,025
bosses, obviously at Google
they didn't want him talking
390
00:30:21,066 --> 00:30:23,115
about stuff like that And they
certainly didn't want him
391
00:30:23,154 --> 00:30:27,662
talking about it to the public.
But what got Blake fired was he
392
00:30:27,701 --> 00:30:32,537
gave documents to the Washington
Post newspaper to prove his
393
00:30:32,596 --> 00:30:37,310
claims, and Google fired him. But
you know he proved his point. So
394
00:30:37,310 --> 00:30:42,760
my point is that even people
who are on the inside are
395
00:30:43,102 --> 00:30:46,690
sounding an alarm bell. They're
saying be wary of this stuff.
396
00:30:47,652 --> 00:30:50,700
Maybe we can't stop it, but you
just need to be careful with it.
397
00:30:50,700 --> 00:30:56,694
And thirdly, this is Geoffrey
Hinton. Geoffrey Hinton is an
398
00:30:56,875 --> 00:30:59,843
older guy who's called the
father of Artificial
399
00:30:59,843 --> 00:31:05,056
Intelligence. Think about that,
if the father of artificial
400
00:31:05,115 --> 00:31:13,085
intelligence is saying I'm
worried about ChatGPT and
401
00:31:13,747 --> 00:31:17,180
computers with feelings, because
I think these computers are
402
00:31:17,220 --> 00:31:21,655
going to become more intelligent
than us. Think about it. Now, I
403
00:31:21,675 --> 00:31:25,308
would admit that I'm a movie
buff, and I know I'm dating myself to a lot
404
00:31:25,308 --> 00:31:29,479
of you, but I can remember a
movie called War Games with
405
00:31:29,519 --> 00:31:35,405
Matthew Broderick, and the whole
thing was Matthew the teenager
406
00:31:35,445 --> 00:31:38,976
and his girlfriend were playing
around with the computer, and he
407
00:31:38,976 --> 00:31:42,413
was trying to show off to her.
So he broke into the national
408
00:31:42,453 --> 00:31:48,134
security computer system and
started talking to one of the
409
00:31:48,154 --> 00:31:50,800
computers in the big computer
system, the government computer
410
00:31:50,820 --> 00:31:54,998
system. And he went into the
computer and the computer said
411
00:31:56,750 --> 00:32:02,721
would you like to play a game?
And so the teenager and his girlfriend
412
00:32:02,721 --> 00:32:05,518
were giggling and playing around,
and they said sure, and so the
413
00:32:05,538 --> 00:32:09,573
computer gave them a list of
games. You can play chess, you
414
00:32:09,594 --> 00:32:12,481
can play blah, blah, blah. And
the teenager of course wanted to
415
00:32:12,481 --> 00:32:16,458
impress his girlfriend and said
I want to play Thermonuclear
416
00:32:16,538 --> 00:32:21,996
Warfare. And the computer said
OK, and the computer's name was
417
00:32:22,096 --> 00:32:25,634
Joshua. I remember the name. I
love that movie, I watched it at
418
00:32:25,634 --> 00:32:29,846
least 100 times And the problem
was the computer started
419
00:32:29,865 --> 00:32:33,865
playing the game for real And
the kid couldn't stop it and the
420
00:32:33,865 --> 00:32:36,690
adults were going crazy. But
everything ended up all right.
421
00:32:37,273 --> 00:32:41,252
But my point is that computers
are computers. The other thing
422
00:32:41,292 --> 00:32:49,684
that Hinton says is this process
scares him. You know I didn't
423
00:32:49,704 --> 00:32:52,680
write down everything he said.
That was one of his quotes. He
424
00:32:52,700 --> 00:32:56,554
said this scares me. Think about
the father. This is the father
425
00:32:56,574 --> 00:32:59,661
saying you know, one of my kids
scared me, or something to that
426
00:32:59,701 --> 00:33:02,779
effect. And his concern is, you
know, he's not going to be able
427
00:33:02,819 --> 00:33:08,180
to play the game, that we need
to have some way to control it.
428
00:33:08,789 --> 00:33:12,635
Now, who's gonna do that?
Because the whole premise is we
429
00:33:12,695 --> 00:33:19,181
have a robot that has the
world's library data inside of
430
00:33:19,320 --> 00:33:22,690
it so that it can think faster
and smarter and quicker. It's
431
00:33:22,770 --> 00:33:26,333
like our Frankenstein to some
extent (another one of my
432
00:33:26,393 --> 00:33:32,076
favorite movies, by the way). And
we built it. But then can we
433
00:33:32,116 --> 00:33:35,775
control it? So I don't want to
scare you, but I think it is
434
00:33:35,934 --> 00:33:41,613
important that I give you both
sides of the argument, the way I
435
00:33:41,613 --> 00:33:44,055
look at it. And again, if
you've been with us for the last
436
00:33:44,055 --> 00:33:48,779
six months or so, you're
familiar with one of my favorite
437
00:33:48,779 --> 00:33:52,794
parts of the brain called the
amygdala. Remember the amygdala?
438
00:33:52,794 --> 00:33:58,951
The amygdala is the emotion
control center of the brain, and
439
00:33:59,071 --> 00:34:05,272
the way I see it, the way I analyze this is
computers don't have amygdalas
440
00:34:05,573 --> 00:34:11,871
yet, yet, yet. Now I don't even
know what an amygdala would look
441
00:34:11,871 --> 00:34:17,659
like in a computer or whatever.
But my point is the difference,
442
00:34:17,659 --> 00:34:20,137
what I've always thought the
difference between me and them
443
00:34:20,610 --> 00:34:25,532
is that I have emotions and they
don't. They may be smarter than
444
00:34:25,532 --> 00:34:30,333
me, but I can love and they
cannot because they don't have
445
00:34:30,393 --> 00:34:35,380
emotions. Well, what if a
computer learns what love is?
446
00:34:35,989 --> 00:34:39,293
Because then if you learn what
love is, you may learn what hate
447
00:34:39,293 --> 00:34:43,931
is, okay? Then that's a whole
slippery slope that you want to
448
00:34:43,952 --> 00:34:50,568
be careful of. So let's switch
here and look specifically at
449
00:34:50,608 --> 00:34:55,601
healthcare. Okay, what are the
areas that the chatbots are
450
00:34:55,641 --> 00:34:59,177
helpful in hospitals and
healthcare delivery that
451
00:34:59,257 --> 00:35:05,255
hospitals like, and why do
hospitals like them? Well, one
452
00:35:05,295 --> 00:35:09,195
of the reasons is, and as I
mentioned before at the
453
00:35:09,235 --> 00:35:14,862
beginning, the use of the
chatbot or the super-duper
454
00:35:14,909 --> 00:35:19,351
advanced thinking computer saves
money for the hospital. That
455
00:35:19,391 --> 00:35:23,112
does that directly and
indirectly. It provides
456
00:35:23,192 --> 00:35:27,813
cybersecurity, it helps with
clinical trials, it helps with
457
00:35:27,853 --> 00:35:32,530
work assistance, it helps with
nursing assistance. One of the
458
00:35:32,590 --> 00:35:41,197
ways that hospitals love using
the robot is in surgery. In
459
00:35:41,297 --> 00:35:45,972
surgery. Now I should share with
you. As I said, I've been a
460
00:35:46,032 --> 00:35:49,737
surgeon for 50 years. As I
mentioned before, my medical
461
00:35:49,778 --> 00:35:53,030
school is celebrating our
50-year anniversary, because I
462
00:35:53,070 --> 00:36:01,255
graduated in 1973 from Yale
Medical School. The robot can
463
00:36:01,375 --> 00:36:09,637
generate a total of close to
$100 billion, over $100 billion, just by using
464
00:36:09,637 --> 00:36:14,177
the robot alone. Okay, this is
what one of the robots looks
465
00:36:14,277 --> 00:36:17,876
like that's used in surgery And
it's used in most types of
466
00:36:17,936 --> 00:36:21,150
surgery. Some are more
beneficial than others. These
467
00:36:21,170 --> 00:36:23,318
are the ones that are very
beneficial with the robot:
468
00:36:23,690 --> 00:36:27,114
Prostate surgery, general
surgery, gynecological surgery,
469
00:36:27,175 --> 00:36:33,101
heart surgery. This is an
example of the robot being used
470
00:36:33,630 --> 00:36:37,195
in, say, a heart surgery case. So
what do you notice immediately
471
00:36:37,255 --> 00:36:41,878
about this picture? You're right,
the surgeon is not standing at
472
00:36:41,878 --> 00:36:45,092
the table, he's sitting off to
the left side. You say wait a
473
00:36:45,152 --> 00:36:47,980
second. Aren't you supposed to
be up there at the patient?
474
00:36:48,472 --> 00:36:53,211
That's the way it is on TV. Well,
guess what? This is a new
475
00:36:53,273 --> 00:36:58,019
process. And, as I said, this has
been around for about 15 years.
476
00:36:58,769 --> 00:37:04,123
This is not last year, or last
November, or '21 or '20 or '19 or '18.
477
00:37:05,550 --> 00:37:11,175
And it is growing in usage. I
know this because for the last
478
00:37:11,657 --> 00:37:15,190
four years, before I retired,
semi-retired, I was running
479
00:37:15,210 --> 00:37:19,391
around the country doing
accreditation work for the Joint
480
00:37:19,391 --> 00:37:24,431
Commission and helping to
accredit hospitals and making
481
00:37:24,492 --> 00:37:27,472
sure that they were following
the rules, and the majority of
482
00:37:27,512 --> 00:37:31,653
them are doing robotic surgery
in increasing numbers,
483
00:37:32,436 --> 00:37:37,451
increasing numbers. Another
example of robotic surgery Here
484
00:37:37,490 --> 00:37:40,135
we have two surgeries Because a
lot of times when we as a
485
00:37:40,175 --> 00:37:43,110
general surgeon I would be doing
a case and I would maybe call
486
00:37:43,130 --> 00:37:47,275
it in a urologist because my the
area that I'm gonna be working
487
00:37:47,315 --> 00:37:50,451
in is gonna be very close to the
ureters or the kidney And I
488
00:37:50,471 --> 00:37:55,699
want a urologist in there, or a
gynecologist would call me in,
489
00:37:56,590 --> 00:38:01,898
obstetrician would call me in
because of possible bowel injury.
490
00:38:01,898 --> 00:38:05,590
So it's not uncommon to have
two physicians working. So the
491
00:38:05,630 --> 00:38:10,190
patient's on the table, the
robot is over the patient, the
492
00:38:10,269 --> 00:38:14,795
nurse is standing by the table
but the surgeons are at the
493
00:38:14,876 --> 00:38:19,833
monitors, like a computer screen
or console. This is what the
494
00:38:19,873 --> 00:38:23,791
robot looks like from a
different perspective. This is
495
00:38:23,831 --> 00:38:27,501
probably the most famous robot;
it's called the DaVinci Xi.
496
00:38:29,213 --> 00:38:34,277
Surgeons love the robot. This is
how it looks in another
497
00:38:34,297 --> 00:38:38,538
type of surgery. We see all the
plastic over it to keep the area
498
00:38:38,538 --> 00:38:46,001
sterile. But robotic surgery is
increasing in scope and speed
499
00:38:46,222 --> 00:38:51,411
in surgery. It's beneficial in
surgery. One of the reasons it's
500
00:38:51,452 --> 00:38:56,882
beneficial in surgery is there's
a smaller scar for the patient.
501
00:38:56,882 --> 00:39:01,552
Smaller scar means less
postoperative pain. The surgery
502
00:39:01,632 --> 00:39:05,231
is more accurate because,
remember, this is a super-duper
503
00:39:05,291 --> 00:39:11,114
intelligent assistant that the
surgeon has and can keep the
504
00:39:11,173 --> 00:39:15,411
surgeon out of danger. More
efficient, more effective, it
505
00:39:15,472 --> 00:39:21,719
helps to provide better patient
care. In summary, the AI future
506
00:39:21,860 --> 00:39:24,637
is now. The artificial
intelligence future is now.
507
00:39:25,110 --> 00:39:30,514
Chatbots are working right now.
Today, artificial intelligence
508
00:39:30,655 --> 00:39:35,976
is non-human learning. The more
advanced chatbot today is
509
00:39:36,036 --> 00:39:41,155
ChatGPT-4, backed by Microsoft. If
you have the Bing search engine
510
00:39:41,275 --> 00:39:44,494
on your Microsoft computer, you
already have it for free.
511
00:39:46,210 --> 00:39:49,717
A chatbot is an advanced search
engine that communicates
512
00:39:49,838 --> 00:39:56,672
verbally with the patient and
with the physician. Robots have
513
00:39:56,693 --> 00:40:01,777
been used in surgery for 15 years,
as I've said, but the other
514
00:40:01,838 --> 00:40:04,894
side of that, though, is one
needs to be cautious with the
515
00:40:04,934 --> 00:40:11,675
whole artificial intelligence
area, if you will. And don't
516
00:40:11,735 --> 00:40:16,233
hesitate to think and
re-evaluate and make sure that
517
00:40:16,273 --> 00:40:20,815
you have the most appropriate
use. Presently, the most advanced
518
00:40:20,815 --> 00:40:26,237
chatbot today is the Bing one, and
this is the logo that you'll see
519
00:40:26,237 --> 00:40:29,936
on your computer screen. My
basic principles, which I like to
520
00:40:29,998 --> 00:40:34,097
always end with: God is in charge. As
I've mentioned each and every
521
00:40:34,137 --> 00:40:38,713
time, God is the leader of my
world. I am a physician of faith,
522
00:40:38,713 --> 00:40:45,396
and because of that, he has
sustained me and allowed me to
523
00:40:45,456 --> 00:40:49,951
do what I have been blessed to
do. I have no bad days. I
524
00:40:50,192 --> 00:40:53,121
discovered many years ago that
whether the day was good or bad,
525
00:40:53,121 --> 00:40:56,338
it was up to me, and so I
decided that I had had enough
526
00:40:56,438 --> 00:40:59,639
bad days and I didn't want it
anymore. So I now only have good
527
00:40:59,639 --> 00:41:03,255
days, so I have great days.
Don't sweat the small stuff, and
528
00:41:03,255 --> 00:41:06,514
most stuff is small. I have
learned that whenever something
529
00:41:06,554 --> 00:41:10,494
bothers me or appears to
aggravate me, it's usually not
530
00:41:10,554 --> 00:41:15,494
that big a deal. So I've learned
to go slow, take a step back
531
00:41:16,396 --> 00:41:20,990
and just not worry about stuff.
Forgiveness is therapy. People
532
00:41:21,050 --> 00:41:24,822
will do or say things to you
that may very well aggravate you
533
00:41:24,822 --> 00:41:30,293
or bother you or hurt you.
Forgive them. It doesn't matter who's
534
00:41:30,353 --> 00:41:34,313
right or wrong. Just forgive
them and you will be amazed that
535
00:41:34,313 --> 00:41:37,192
not only is that the right
thing to do, it is also
536
00:41:37,313 --> 00:41:41,476
therapeutic for you. And the
fifth thing is everything is a
537
00:41:41,516 --> 00:41:44,856
relationship. Most relationships
are built on three things:
538
00:41:45,318 --> 00:41:49,052
mutual respect, mutual trust,
good communication. If you have
539
00:41:49,112 --> 00:41:53,291
those three things, you have a
good relationship. If you do not
540
00:41:53,291 --> 00:41:56,809
have those three things, then
you have some work to do to
541
00:41:56,871 --> 00:42:02,010
build that relationship up to
where it should be. Are there
542
00:42:02,030 --> 00:42:02,653
any questions?
543
00:42:03,871 --> 00:42:06,938
Speaker 5: When you were
talking about the benefits of AI
544
00:42:06,938 --> 00:42:12,961
in the hospitals in relation to
patient satisfaction?
545
00:42:13,150 --> 00:42:13,331
Speaker 2: Yes.
546
00:42:15,251 --> 00:42:21,623
Speaker 5: Is that because of
the pace at which the computer will
547
00:42:21,643 --> 00:42:25,538
communicate with the patients
before they leave, or how is
548
00:42:25,637 --> 00:42:27,461
that beneficial or faster?
549
00:42:27,710 --> 00:42:30,771
Speaker 2: Okay, let me repeat
the question. The question is
550
00:42:30,811 --> 00:42:34,873
that on one of my slides I had
some information about how the
551
00:42:34,913 --> 00:42:39,530
chatbot is beneficial in terms
of patient satisfaction, and is
552
00:42:39,530 --> 00:42:41,976
that when the patient is still
in the hospital or when the
553
00:42:42,016 --> 00:42:48,117
patient goes home? This is
mainly used for surveys of
554
00:42:48,159 --> 00:42:52,811
patients, and so it's mostly
when the patient goes home in
555
00:42:52,851 --> 00:42:55,813
terms of patient satisfaction
where the patient can give
556
00:42:56,072 --> 00:43:00,650
accurate and quick feedback
about their experience, and
557
00:43:00,690 --> 00:43:03,918
that's one of the ways that it
helps with patient sentiment and
558
00:43:03,918 --> 00:43:10,539
patient experience. Also, there
could be surveys that patients
559
00:43:10,559 --> 00:43:14,012
would take even while they're in
the hospital, and some hospitals do
560
00:43:14,052 --> 00:43:17,041
that also but they can do it
very quickly with the chatbot
561
00:43:17,831 --> 00:43:21,391
and not have to have a person or
a nurse to do that, and they
562
00:43:21,411 --> 00:43:26,492
can get that information faster
back to the hospital. Thank you.
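As a rough sketch of the sentiment analytics described here, the fragment below scores patient survey comments in Python by counting positive and negative keywords. The word lists and sample comments are invented for the example; real sentiment analysis in a hospital system would be far more sophisticated.

    # Toy sentiment scoring for patient survey comments:
    # count positive and negative keywords to flag overall tone.
    POSITIVE = {"caring", "clean", "helpful", "quick", "excellent"}
    NEGATIVE = {"rude", "slow", "dirty", "painful", "confusing"}

    def score(comment: str) -> int:
        words = set(comment.lower().split())
        return len(words & POSITIVE) - len(words & NEGATIVE)

    comments = [
        "The nurses were caring and helpful",
        "Discharge was slow and the paperwork confusing",
    ]
    for c in comments:
        label = "positive" if score(c) > 0 else "negative" if score(c) < 0 else "neutral"
        print(label, "-", c)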
563
00:43:26,492 --> 00:43:31,215
Any other questions? Yes, go
ahead.
564
00:43:35,958 --> 00:43:36,619
Speaker 4: I have a question.
565
00:43:37,380 --> 00:43:37,882
Speaker 2: Yeah, go ahead.
566
00:43:38,891 --> 00:43:41,851
Speaker 4: Okay, actually I have
two questions. Yeah, in that
567
00:43:41,911 --> 00:43:46,474
illustration with the DaVinci
robot, you had the two surgeons.
568
00:43:46,474 --> 00:43:51,360
Like you said, this is
technology that's a few years
569
00:43:51,521 --> 00:43:56,019
old. Has there been any growth,
with those two surgeons not
570
00:43:56,099 --> 00:43:59,217
being in the room, perhaps being
in another city and conducting
571
00:43:59,277 --> 00:43:59,778
surgeries?
572
00:44:01,313 --> 00:44:05,036
Speaker 2: No, no. One, because I
think it would not be
573
00:44:05,076 --> 00:44:09,974
acceptable to do that. And two,
I think you would then get into
574
00:44:10,014 --> 00:44:13,902
situations of error and
litigation and that sort of
575
00:44:13,943 --> 00:44:16,876
thing. So it hasn't quite gotten
to that level where you could
576
00:44:16,896 --> 00:44:20,655
be absolutely absent. Now that
may happen in the future, but
577
00:44:20,675 --> 00:44:22,219
presently that's not the case.
578
00:44:22,690 --> 00:44:26,152
Speaker 4: Okay, Thank you. And
my second question is do you see
579
00:44:26,152 --> 00:44:30,159
a day coming when those two
surgeons would be completely
580
00:44:30,219 --> 00:44:33,164
replaced by another DaVinci
robot?
581
00:44:35,416 --> 00:44:43,054
Speaker 2: Maybe. What's the answer? It
is possible, because I've gotten
582
00:44:43,054 --> 00:44:46,311
to the point now where almost
anything is possible. But, to
583
00:44:46,391 --> 00:44:48,614
answer your question, it is
possible because things are
584
00:44:48,675 --> 00:44:53,643
moving in the direction of the
robotics doing more and more and
585
00:44:53,643 --> 00:44:57,498
more. So that's not
inconceivable, to be honest with
586
00:44:57,498 --> 00:44:57,639
you.
587
00:44:58,411 --> 00:44:59,153
Speaker 4: All right, thank you.
588
00:45:00,309 --> 00:45:04,759
Speaker 1: The tools that you're
using, are they the same that
589
00:45:04,778 --> 00:45:06,242
they would be in your hand?
590
00:45:08,112 --> 00:45:10,911
Speaker 2: They're similar to what
they would be in my hand. For
591
00:45:11,010 --> 00:45:14,498
instance, they have extensions
and by sitting at the monitor
592
00:45:14,637 --> 00:45:17,682
I'm able to move them
because they're an extension of
593
00:45:17,724 --> 00:45:23,414
me. So they're sort of similar
but modified for the robot. So
594
00:45:23,494 --> 00:45:26,668
it's basically similar but it
may be a little longer, it may
595
00:45:26,708 --> 00:45:29,597
be a little different. And
remember, the robot doesn't get
596
00:45:29,637 --> 00:45:34,985
tired. I remember we used to
talk about five-, six-hour
597
00:45:35,045 --> 00:45:38,577
surgeries and what that would be like.
The robot doesn't get tired. And
598
00:45:38,617 --> 00:45:41,952
remember, the surgeons are
sitting. So that's another
599
00:45:41,992 --> 00:45:45,659
advantage too. But there's maybe
a slight modification of the
600
00:45:45,739 --> 00:45:50,096
instruments with the robot. Any
other questions?
601
00:45:51,411 --> 00:45:59,436
Speaker 4: One last question.
Back to the chatbot, the AI
602
00:45:59,456 --> 00:46:02,121
chatbot. There are some concerns
about accuracy of the
603
00:46:02,181 --> 00:46:07,297
information that's coming back,
and you talked about that. Where
604
00:46:07,297 --> 00:46:10,905
do you think that's going And
how is that going to be
605
00:46:10,987 --> 00:46:13,715
addressed? I mean, obviously you
can't do it with legislation,
606
00:46:13,775 --> 00:46:18,224
but what's the tech industry
doing about the accuracy of the
607
00:46:18,306 --> 00:46:19,949
information that's coming back
from all the searches that
608
00:46:19,969 --> 00:46:22,525
they're doing? That's an
excellent question.
609
00:46:25,777 --> 00:46:28,708
Speaker 2: I don't think the
tech industry is doing as much
610
00:46:28,768 --> 00:46:32,143
as it could And I think that's
why a lot of them are going to
611
00:46:32,184 --> 00:46:35,193
Congress and say you guys need
to do something. And obviously,
612
00:46:35,655 --> 00:46:41,568
you know, a 65-year-old man in
the Senate knows less about the
613
00:46:41,648 --> 00:46:46,355
tech stuff than I do, so they
can't do anything. I do believe
614
00:46:46,375 --> 00:46:51,514
that they are legitimately
worried, as they should be. But
615
00:46:51,675 --> 00:46:54,025
to answer your
question specifically, I think
616
00:46:54,085 --> 00:46:57,972
it's a matter of buyer beware.
Those of us, like me and you and
617
00:46:58,074 --> 00:47:03,179
others, who are end users, we
have to have a certain degree of
618
00:47:03,179 --> 00:47:08,490
organization and healthy
suspicion about everything. For
619
00:47:08,550 --> 00:47:12,793
instance, I've started using the
chatbot, and I even used it to
620
00:47:12,813 --> 00:47:15,972
help prepare this presentation.
I was very impressed
621
00:47:15,972 --> 00:47:19,594
because it's so fast, you know,
and literally when I put
622
00:47:19,673 --> 00:47:24,481
something into the request
or the search, it starts in a
623
00:47:24,621 --> 00:47:28,432
second telling me what I need to
know. But I would then also go
624
00:47:28,592 --> 00:47:32,710
out and maybe do a different
search to validate that. So I
625
00:47:32,751 --> 00:47:36,329
think we're going to have to do
more of that sort of thing, but
626
00:47:36,409 --> 00:47:39,175
I think it's going to take a
combination of all those things,
627
00:47:39,175 --> 00:47:42,480
but I think we're going to have
to do our part and not just
628
00:47:42,559 --> 00:47:46,652
rely 100% on its accuracy. And
they have actually said that the
629
00:47:46,652 --> 00:47:51,211
area where it is weakest is
opinion. Of course, if you ask
630
00:47:51,311 --> 00:47:55,599
it who's better, Republican or
Democrat, you know, it may
631
00:47:55,639 --> 00:47:59,291
give you what it thinks, but it
may not be accurate, and those
632
00:47:59,351 --> 00:48:02,978
types of questions are the ones
that you have to sort of take
633
00:48:03,018 --> 00:48:03,900
with a grain of salt.
634
00:48:05,349 --> 00:48:05,630
Speaker 4: Thank you.
635
00:48:08,409 --> 00:48:08,610
Speaker 2: Yes.
636
00:48:11,630 --> 00:48:16,253
Speaker 3: Thank you so much for
your presentation. I really
637
00:48:16,293 --> 00:48:20,500
liked the end of it where, after
you've given us all this news,
638
00:48:20,519 --> 00:48:25,092
you talk about my basic
principles. And you know, you can
639
00:48:25,152 --> 00:48:30,402
feel free to use that in all of
your presentations, because
640
00:48:30,481 --> 00:48:33,427
sometimes we need that reminder,
because there are so many things
641
00:48:33,467 --> 00:48:38,313
going on in the world today that
cause us to fear. And you know,
642
00:48:38,313 --> 00:48:42,527
the word of God tells us not to
fear, and when fear comes it's
643
00:48:42,588 --> 00:48:43,250
not from God.
644
00:48:44,231 --> 00:48:47,253
Speaker 2: Yeah, I think I got
most of what you said, but to
645
00:48:47,353 --> 00:48:51,777
be honest with you, I decided to
do exactly what you suggest. I
646
00:48:51,856 --> 00:48:55,641
do use my basic principles in
every single presentation for
647
00:48:55,661 --> 00:48:58,282
the reason that you mentioned,
because I think it's important
648
00:48:58,563 --> 00:49:03,469
that not only do I give you
didactic healthcare, legal, and
649
00:49:03,510 --> 00:49:06,353
financial information. I think
it's important that there be a
650
00:49:06,454 --> 00:49:10,478
second message there, and that
is about me personally and about
651
00:49:10,478 --> 00:49:14,344
what I've learned and what's
worked for me, and certainly my
652
00:49:14,443 --> 00:49:17,632
faith is a part of that. So
thank you for your suggestion.
653
00:49:17,954 --> 00:49:20,539
I completely agree with you,
and I've already decided to do
654
00:49:20,579 --> 00:49:21,541
that. Thank you.
655
00:49:22,510 --> 00:49:26,510
Speaker 2: Any other questions?
Speaker 3: I just want to
656
00:49:26,590 --> 00:49:31,556
ask one more thing. With AI,
I think the
657
00:49:31,597 --> 00:49:38,003
concern is probably, when you
consider Bible prophecy, how we
658
00:49:38,043 --> 00:49:40,751
are headed towards one world
government. I mean, you talk
659
00:49:40,791 --> 00:49:45,871
about how the move is there for one
world religion, there's one
660
00:49:45,931 --> 00:49:52,137
world money, then to have no
borders. I'm interested to hear
661
00:49:52,137 --> 00:49:57,702
your take on how you see AI
bringing all this one world,
662
00:49:57,762 --> 00:50:02,626
this and that together, and how
it's going to just affect the
663
00:50:02,686 --> 00:50:03,027
world.
664
00:50:03,951 --> 00:50:07,353
Speaker 2: Well, you know, to be
honest, I don't have a clue how
665
00:50:07,353 --> 00:50:10,538
that's going to work out.
I'm not familiar with the one
666
00:50:10,557 --> 00:50:13,940
world concept, but I think
one of the things that's
667
00:50:14,041 --> 00:50:19,007
interesting and fearful at the
same time is that no one knows, 12
668
00:50:19,047 --> 00:50:22,911
months from now, 24 months from
now, 36 months from now,
669
00:50:22,971 --> 00:50:26,195
what the impact is going to be
with AI in multiple areas
670
00:50:26,615 --> 00:50:31,422
religion, healthcare, law,
medicine, finance, etc. I
671
00:50:31,563 --> 00:50:36,074
honestly don't know. But I think
if each
672
00:50:36,114 --> 00:50:39,822
of us has our basic principles
intact, it does not matter what
673
00:50:39,862 --> 00:50:44,494
the world does. We stand for what
we believe and become leaders.
674
00:50:44,554 --> 00:50:47,956
You know, it is not accidental
that this is called a leadership
675
00:50:47,956 --> 00:50:52,242
masterclass, because I believe,
as I said in my basic
676
00:50:52,262 --> 00:50:55,164
principles when I start out,
leaders can change the
677
00:50:55,204 --> 00:50:59,755
world. So it's up to us to
stand and be leaders in all of
678
00:50:59,815 --> 00:51:03,059
that chaos, if you will, and
stand for what we believe is
679
00:51:03,139 --> 00:51:13,492
right. Thank you. So, finally,
thank you all so much. And I
680
00:51:13,492 --> 00:51:16,277
like this slide. I came
across it and I like it
681
00:51:16,597 --> 00:51:18,961
because we always talk about
change and we talk about
682
00:51:19,041 --> 00:51:22,753
leadership. I like the idea of
us looking at the person in the
683
00:51:22,793 --> 00:51:26,320
mirror and saying: what am
I doing to change the world?
684
00:51:26,360 --> 00:51:29,449
What am I doing to make people
love each other better? What am
685
00:51:29,570 --> 00:51:34,518
I doing to make people happier,
etc.? So I'd like to leave this
686
00:51:34,518 --> 00:51:38,469
with you: be the change that you
want to see in the world. Let
687
00:51:38,489 --> 00:51:43,380
me also say that these lectures
are also on a new podcast that
688
00:51:43,400 --> 00:51:47,193
you will be hearing about, and
so if you listen to
689
00:51:47,193 --> 00:51:50,757
podcasts and you have a
platform where you download your
690
00:51:50,797 --> 00:51:54,682
podcasts, look us up on your
platform and you'll be able to
691
00:51:54,722 --> 00:51:58,175
hear some of the same material.
Thank you and have a wonderful
692
00:51:58,215 --> 00:51:58,356
day.
693
00:52:28,052 --> 00:52:31,639
Speaker 1: And then share it
with your family, friends, and/or
694
00:52:31,639 --> 00:52:35,835
your coworkers. They'll be glad
you did. So, until the next time,
695
00:52:35,835 --> 00:52:39,831
live your best possible life
the best possible way.