Transcript
1
00:00:00,010 --> 00:00:03,280
Dr William T. Choctaw: Just, just
for fun, uh, and I said, uh, and,
2
00:00:03,280 --> 00:00:04,970
and so I, I went to the chat bot.
3
00:00:04,970 --> 00:00:08,480
I said, tell me all the great things
you've done in the last 12 months.
4
00:00:08,719 --> 00:00:11,440
What, what have you learned
in the last 12 months?
5
00:00:11,960 --> 00:00:16,840
Uh, and so it says, well, uh,
we've made advances in powering
6
00:00:16,849 --> 00:00:18,450
new drug discovery processes.
7
00:00:18,880 --> 00:00:23,180
Think about the, uh, pandemic, and
think about vaccines, and think
8
00:00:23,189 --> 00:00:26,680
about the issue of, well, how fast
can you get that vaccine done?
9
00:00:26,680 --> 00:00:29,250
It used to take months
and years to do that.
10
00:00:29,449 --> 00:00:30,900
Doesn't take months and years now.
11
00:00:31,040 --> 00:00:31,450
Why?
12
00:00:31,529 --> 00:00:34,710
Because the pharmaceutical companies
have artificial intelligence
13
00:00:34,940 --> 00:00:36,910
equipment in their organization.
14
00:00:37,450 --> 00:00:39,900
Uh, quantum computing
that solves problems,
15
00:00:40,510 --> 00:00:42,420
um, that classical computers cannot solve.
16
00:00:42,600 --> 00:00:46,150
So what it's done is to take, it's
taken those big smart computers
17
00:00:46,340 --> 00:00:47,870
and it's taken them up a notch.
18
00:00:47,979 --> 00:00:50,590
So whatever they were doing,
this can do it faster.
19
00:00:51,810 --> 00:00:55,239
Uh, increased improvements in,
in cleaning and surgery, and
20
00:00:55,500 --> 00:00:57,290
we talked about that already.
21
00:00:57,749 --> 00:01:02,170
Um, the computers, the robots,
particularly in surgery, are smaller,
22
00:01:02,660 --> 00:01:04,239
uh, and can do the work faster.
23
00:01:04,720 --> 00:01:08,060
Um, what kind of network do
you have in your neighborhood?
24
00:01:08,380 --> 00:01:12,850
If you have a 5G network, you
know, that's because of the,
25
00:01:12,850 --> 00:01:14,649
uh, artificial intelligence.
26
00:01:15,020 --> 00:01:19,009
Because they want to push stuff out
to the customer so you can watch those
27
00:01:19,049 --> 00:01:23,560
movies and games faster with fewer
interruptions, et cetera, et cetera.
28
00:01:24,159 --> 00:01:30,070
Um, and climate change mitigation is
working overtime, uh, for some to help
29
00:01:30,070 --> 00:01:32,160
to decrease the climate change problem.
30
00:01:32,765 --> 00:01:33,835
So, what are the risks?
31
00:01:33,835 --> 00:01:34,585
Very quickly.
32
00:01:34,765 --> 00:01:38,955
The risk is, uh, just like we
always said about any computer,
33
00:01:39,315 --> 00:01:41,145
garbage in, garbage out.
34
00:01:41,455 --> 00:01:44,365
Somebody has to put the
information in the computer.
35
00:01:44,595 --> 00:01:46,244
Somebody has to give it the data.
36
00:01:46,455 --> 00:01:48,565
Somebody has to give it the proposition.
37
00:01:48,814 --> 00:01:50,744
Somebody has to give it the problem.
38
00:01:51,044 --> 00:01:53,585
That somebody is a human, initially.
39
00:01:53,774 --> 00:01:56,475
So, who is that human or humans?
40
00:01:57,090 --> 00:01:58,850
What, what is their understanding?
41
00:01:59,320 --> 00:02:01,320
Uh, what, what are their proclivities?
42
00:02:01,640 --> 00:02:02,710
Are they biased?
43
00:02:02,950 --> 00:02:06,810
Are they all of a
particular race or culture?
44
00:02:07,280 --> 00:02:10,950
And so, is what they put
into that computer skewed?
45
00:02:10,999 --> 00:02:15,680
And sometimes it's, um, unconscious, you
know; that doesn't mean they're bad people.
46
00:02:15,999 --> 00:02:18,870
And so, these are some of the things
that one has to be careful of.
47
00:02:19,660 --> 00:02:25,340
Uh, with, with, um, information from,
uh, the artificial intelligence vehicle.
48
00:02:25,790 --> 00:02:29,300
Uh, and so you have to make, make
sure that information makes sense.
49
00:02:29,639 --> 00:02:32,400
Another reason why you want to know
what your children are watching,
50
00:02:32,859 --> 00:02:34,840
and what they're listening to, okay?
51
00:02:35,289 --> 00:02:35,989
Safety.
52
00:02:36,580 --> 00:02:40,249
Um, you could imagine, just like
there are good people in the world,
53
00:02:40,249 --> 00:02:41,370
there are bad people in the world.
54
00:02:41,490 --> 00:02:44,110
And those bad people in the world
are just as smart as some of
55
00:02:44,110 --> 00:02:45,200
those good people in the world.
56
00:02:45,410 --> 00:02:46,940
And they're working overtime.
57
00:02:47,230 --> 00:02:52,810
You see how they can use that information
and technology to, uh, continue
58
00:02:52,820 --> 00:02:54,960
along their bad or negative course.
59
00:02:55,160 --> 00:02:56,979
And so you want to be aware of that.
60
00:02:57,190 --> 00:03:00,330
And profiling and, and
deception, sort of like bias.
61
00:03:00,750 --> 00:03:06,040
That, um, uh, one, one of the chatbots
was asked, um, uh, tell me
62
00:03:06,040 --> 00:03:07,620
about the doctors in the country.
63
00:03:08,090 --> 00:03:11,280
And it says, well, the doctors are
male and they do this and do that.
64
00:03:12,260 --> 00:03:13,230
The reality is:
65
00:03:13,690 --> 00:03:19,690
More of the doctors who graduated from
medical school in 2024 were female, not male, right?
66
00:03:19,870 --> 00:03:22,490
So that's a bias in the system.
67
00:03:22,960 --> 00:03:26,259
No doubt put in when the data was put in,
and maybe back then that was the case.
68
00:03:26,420 --> 00:03:27,560
It's not the case now.
69
00:03:28,019 --> 00:03:30,500
And so those are some of the things
you have to be concerned about.
70
00:03:30,709 --> 00:03:34,154
And the same thing could be
racial bias and etc., etc., etc.
71
00:03:34,644 --> 00:03:35,354
Profiling.
72
00:03:35,634 --> 00:03:39,894
Well, if you are this height and,
and look this way and this color,
73
00:03:40,114 --> 00:03:44,304
more likely than not, you're going
to do X as compared to doing Y.
74
00:03:44,864 --> 00:03:45,344
Right?
75
00:03:45,534 --> 00:03:49,684
So the law enforcement said, well, I'm
going to stop him before he does X.
76
00:03:49,964 --> 00:03:52,334
And so I'm, I'm, I'm
protecting these other people.
77
00:03:52,975 --> 00:03:54,704
So again, you want to be careful of that.
78
00:03:55,495 --> 00:03:56,665
Knowledge is power.
79
00:03:56,855 --> 00:04:01,455
So if you're aware that that's a
particular problem with this, this
80
00:04:01,455 --> 00:04:05,605
technology, then you can make sure
that that doesn't cause a problem.
81
00:04:05,975 --> 00:04:08,904
And again, some of the things we
talked about in health care, clinical
82
00:04:08,904 --> 00:04:13,204
decision support, helping us doctors
to make the right decision about
83
00:04:13,264 --> 00:04:17,294
what to do with patients, literature
searches, procedure searches.
84
00:04:19,769 --> 00:04:22,050
This is particularly for hospitals.
85
00:04:22,540 --> 00:04:27,810
Real quickly before we close,
this gentleman, Geoffrey Hinton, he
86
00:04:27,810 --> 00:04:30,520
was vice president of Google, um.
87
00:04:30,709 --> 00:04:35,179
And he resigned last year, and one
of the reasons he resigned was,
88
00:04:35,680 --> 00:04:38,600
and he's considered the godfather
of artificial intelligence. Uh,
89
00:04:38,600 --> 00:04:40,283
I may have mentioned him before.
90
00:04:40,283 --> 00:04:41,684
But he's at Sccree, ups.
91
00:04:42,224 --> 00:04:47,765
His fear and his worry, and he helped
to create AI, is that it's more
92
00:04:47,765 --> 00:04:49,555
intelligent than we are right now.
93
00:04:49,724 --> 00:04:52,754
Now, of course, the company said, no,
no, no, no, we're, we're working on
94
00:04:52,754 --> 00:04:56,374
it, we're, we have it under control,
there's nothing to worry about, but
95
00:04:56,374 --> 00:04:58,594
this guy helped invent it, and he quit.
96
00:04:59,239 --> 00:04:59,859
He quit.
97
00:05:00,189 --> 00:05:03,349
So I said, Hmm, I wonder why he quit.
98
00:05:05,510 --> 00:05:09,959
What does he know that I don't know,
since he created this, this technology.
99
00:05:10,369 --> 00:05:13,099
And so, well, I'm sure he
didn't have to work anymore.
100
00:05:13,550 --> 00:05:17,620
And so what he's doing is, I
think he feels a little guilty.
101
00:05:17,830 --> 00:05:21,710
And he's gone around trying to make it
right, you know, and educate people.
102
00:05:23,559 --> 00:05:26,780
But he's saying that it's
more intelligent than us.
103
00:05:27,234 --> 00:05:31,755
Right now, that it will figure
out ways to manipulate us, that
104
00:05:31,755 --> 00:05:33,684
we cannot stop working on it.
105
00:05:34,195 --> 00:05:38,655
Um, and, uh, and that he wanted
to be able to speak freely.
106
00:05:39,474 --> 00:05:43,354
Now, I don't know enough about it to do
all these things, but I do know enough
107
00:05:43,715 --> 00:05:48,795
to just try to be careful and, again,
go back to our original belief.
108
00:05:48,795 --> 00:05:49,875
Knowledge is power.
109
00:05:50,095 --> 00:05:51,025
Knowledge is power.
110
00:05:51,025 --> 00:05:52,625
I don't want something controlling me.
111
00:05:52,975 --> 00:05:57,145
I want to have enough knowledge and
information so at least I can protect
112
00:05:57,145 --> 00:05:59,895
myself and those that I care about. Again,
113
00:05:59,895 --> 00:06:04,593
the same thing we're talking about in
terms of limited memory and self-aware
114
00:06:04,593 --> 00:06:05,072
AI,
115
00:06:05,072 --> 00:06:06,508
real quickly.
116
00:06:06,508 --> 00:06:08,424
We talked about sentience.
117
00:06:08,705 --> 00:06:09,975
They're building a field.
118
00:06:10,415 --> 00:06:15,945
Um, let's go back and then you remember
way back from the first podcast or the
119
00:06:15,945 --> 00:06:23,515
first, uh, um, um, presentation we made
about this, uh, in terms of masterclass
120
00:06:23,815 --> 00:06:25,345
is our good friend, the amygdala.
121
00:06:26,234 --> 00:06:31,615
If we cut the brain in half, uh, that
red area there is called the amygdala.
122
00:06:32,175 --> 00:06:35,735
Uh, and the amygdala is the
emotion center of the brain.
123
00:06:36,085 --> 00:06:39,995
It's where the feelings come from,
good feelings, bad feelings, right?
124
00:06:40,735 --> 00:06:47,365
What distinguishes us, in my
opinion, from, um, uh, Copilot or
125
00:06:47,365 --> 00:06:51,735
Bard or some of these others is that
they don't have amygdalas yet.
126
00:06:54,355 --> 00:06:54,725
Yet.
127
00:06:55,135 --> 00:06:56,664
Well, I should say that I know of.
128
00:06:57,320 --> 00:06:59,960
I should qualify that, right?
129
00:07:00,490 --> 00:07:06,210
Because clearly, my belief is that once
they get an amygdala or an amygdala
130
00:07:07,419 --> 00:07:13,880
type structure or function or that
functionality, they can then feel.
131
00:07:14,130 --> 00:07:15,020
They can then feel.
132
00:07:15,030 --> 00:07:17,990
They can be happy or they can be sad.
133
00:07:18,160 --> 00:07:19,460
What do you do when you're happy?
134
00:07:19,880 --> 00:07:21,529
What do you do when you're sad?
135
00:07:21,820 --> 00:07:22,170
Right?
136
00:07:22,400 --> 00:07:24,879
What, what, what do you do
when you're running around
137
00:07:24,880 --> 00:07:26,380
and everything is going well?
138
00:07:26,569 --> 00:07:28,160
What do you do when you're angry?
139
00:07:28,359 --> 00:07:30,179
Anger is an emotion.
140
00:07:30,419 --> 00:07:31,459
It's an emotion.
141
00:07:31,469 --> 00:07:33,400
Love is an emotion.
142
00:07:34,040 --> 00:07:39,280
Uh, and I think most of us humans
would admit we don't 100 percent
143
00:07:39,310 --> 00:07:43,939
understand all these emotions, let
alone control them 100 percent of
144
00:07:43,959 --> 00:07:46,309
the time, right, if we choose to.
145
00:07:46,770 --> 00:07:51,185
Um, think if you put that into
this powerful entity that's
146
00:07:51,185 --> 00:07:52,775
smarter than you, right?
147
00:07:52,784 --> 00:07:54,914
'Cause you made it, and you made it
148
00:07:54,914 --> 00:07:56,474
so it would be smarter than you.
149
00:07:56,815 --> 00:08:02,564
And then you, you give it the ability
to feel and experience emotion.
150
00:08:02,814 --> 00:08:03,774
I'll just stop there.
151
00:08:03,775 --> 00:08:07,544
I'm sure you're going to,
I'll let you marinate on that.
152
00:08:10,395 --> 00:08:15,065
I believe, and we're almost done, I
believe it is important to say: parents
153
00:08:15,065 --> 00:08:19,055
don't have to be experts on computers,
and I am certainly not suggesting that.
154
00:08:19,614 --> 00:08:23,414
I am suggesting, though, as a parent,
that you at least be able to speak
155
00:08:23,414 --> 00:08:25,075
some of the language of computers.
156
00:08:25,404 --> 00:08:26,164
Basic stuff.
157
00:08:26,765 --> 00:08:27,495
Basic stuff.
158
00:08:27,765 --> 00:08:29,365
Why do you need to speak
some of the language?
159
00:08:29,515 --> 00:08:31,304
So, I'll tell you what this is.
160
00:08:31,544 --> 00:08:32,534
No, no, no, don't do that.
161
00:08:32,664 --> 00:08:35,845
I mean, they would love it, because
they want to turn those tables, right?
162
00:08:36,015 --> 00:08:38,105
You remember what it was
like to be a teenager?
163
00:08:38,305 --> 00:08:39,534
You know, you wanted to assert
164
00:08:40,505 --> 00:08:42,745
that manhood or womanhood or whatever.
165
00:08:43,174 --> 00:08:46,175
Uh, but, but, but you want
to speak the language.
166
00:08:46,565 --> 00:08:48,355
What, what, what do I
mean by speaking the language?
167
00:08:48,724 --> 00:08:50,775
You want to always be able to communicate.
168
00:08:51,295 --> 00:08:53,464
Remember I always talk
about relationships, mutual
169
00:08:53,464 --> 00:08:54,875
respect, mutual trust.
170
00:08:55,305 --> 00:09:00,375
I believe the most important part of
that trilogy is good communication.
171
00:09:00,775 --> 00:09:02,364
You can fuss and fight with your kids.
172
00:09:02,364 --> 00:09:02,714
Okay.
173
00:09:02,754 --> 00:09:03,334
That's fine.
174
00:09:03,749 --> 00:09:07,920
But you never, never, never, never, never
want to lose communication with them.
175
00:09:08,470 --> 00:09:10,840
You don't want them to
stop talking to you, right?
176
00:09:11,390 --> 00:09:11,870
Why?
177
00:09:12,160 --> 00:09:14,959
Because if they don't talk to you,
you don't know what's going on.
178
00:09:15,420 --> 00:09:17,720
And if you don't know what's
going on, you can't help.
179
00:09:17,940 --> 00:09:19,480
I don't care how right you are.
180
00:09:19,480 --> 00:09:22,860
I don't care how, I think I
told the story, I don't know,
181
00:09:22,890 --> 00:09:24,370
but I'll tell it real quick.
182
00:09:24,780 --> 00:09:27,849
But when my sons, and they grew up in St.
183
00:09:27,849 --> 00:09:33,505
Stephen's, when my sons were teenagers,
they must've been preteen and teenage.
184
00:09:33,815 --> 00:09:37,075
I had a thing about where you dress a
certain way when you come to church.
185
00:09:37,455 --> 00:09:40,155
I was taught it's God's
house, you be respectful.
186
00:09:40,465 --> 00:09:42,175
You don't dress just any way to come to church.
187
00:09:42,875 --> 00:09:43,874
Just, just my thing.
188
00:09:44,155 --> 00:09:47,335
And so my, I had this thing
about you should wear a tie
189
00:09:47,734 --> 00:09:52,444
if you go to Sunday morning service.
Well, they thought that was silly.
190
00:09:53,964 --> 00:09:56,295
They said, why do I have to wear a tie?
191
00:09:56,645 --> 00:09:58,344
What, what, what difference does it make?
192
00:09:58,694 --> 00:09:59,794
And I went through my whole thing.
193
00:09:59,834 --> 00:10:00,865
They still thought it was silly.
194
00:10:01,795 --> 00:10:04,575
So I said, you know, the ultimate trump card.
195
00:10:04,874 --> 00:10:07,104
I am your father, right?
196
00:10:07,995 --> 00:10:09,864
You will do it, you
know, that whole thing.
197
00:10:09,865 --> 00:10:13,315
I learned a long time ago that that
just doesn't work after a while.
198
00:10:13,515 --> 00:10:14,285
But, but, um.
199
00:10:14,855 --> 00:10:18,105
Thank God, uh, that God
said, you know what?
200
00:10:18,405 --> 00:10:21,185
Really, Choctaw, does it really matter?
201
00:10:21,454 --> 00:10:25,225
I mean, seriously, but this is my, my
conversation with the person in the
202
00:10:25,225 --> 00:10:28,780
mirror and, and I, and I came to the
conclusion, it, it really doesn't matter
203
00:10:29,245 --> 00:10:32,905
that what, what's most important is that
they go to church, that they
204
00:10:32,905 --> 00:10:34,825
walk into the building, who cares?
205
00:10:35,185 --> 00:10:39,975
Continue that conversation with them,
uh, so that they can tell me when
206
00:10:39,975 --> 00:10:42,204
they think I'm being silly, right?
207
00:10:42,324 --> 00:10:45,120
Because if they're afraid of me, they
say, oh, no, I can't say that to dad.
208
00:10:45,120 --> 00:10:45,400
I'll get...
209
00:10:46,300 --> 00:10:50,859
Now, I don't know, so then I'm going
to continue down this path being wrong,
210
00:10:51,209 --> 00:10:52,650
and I'm certainly not reaching them.
211
00:10:52,780 --> 00:10:53,400
What's my point?
212
00:10:53,449 --> 00:10:57,790
My point is, you always want to keep
communication with your kids, and
213
00:10:57,849 --> 00:10:59,239
you don't have to always be right.
214
00:10:59,410 --> 00:11:02,219
The other thing I learned from
that very quickly is I went
215
00:11:02,219 --> 00:11:03,119
back and apologized to them.
216
00:11:03,880 --> 00:11:07,409
I went back and said, you know what,
you were right, and I was wrong.
217
00:11:07,430 --> 00:11:11,999
I was way out in the field, blah, blah,
blah, blah, blah, blah, blah, blah, blah.
218
00:11:12,159 --> 00:11:13,089
Why did I do that?
219
00:11:13,089 --> 00:11:14,634
Because I'm trying to teach them
220
00:11:14,905 --> 00:11:16,725
that everybody makes mistakes.
221
00:11:17,265 --> 00:11:21,375
And when you make a mistake, the way you
get over it, the way you grow, the way
222
00:11:21,375 --> 00:11:23,695
you learn from it, is you acknowledge it.
223
00:11:24,105 --> 00:11:24,865
I'm sorry.
224
00:11:24,865 --> 00:11:25,545
I screwed up.
225
00:11:25,575 --> 00:11:26,385
I shouldn't have done it.
226
00:11:26,395 --> 00:11:27,414
Please forgive me.
227
00:11:27,754 --> 00:11:28,524
Blah, blah, blah.
228
00:11:28,525 --> 00:11:30,565
God is not finished with me yet.
229
00:11:30,995 --> 00:11:34,475
Whatever, whatever you want to say,
but acknowledge it and acknowledge
230
00:11:34,475 --> 00:11:38,995
it to the person, particularly
children, particularly children, that
231
00:11:38,995 --> 00:11:40,155
you're having this conversation with.
232
00:11:40,555 --> 00:11:45,735
You can find information about artificial
intelligence anywhere on, uh, on, online.
233
00:11:46,065 --> 00:11:50,365
I would encourage you to go to various
universities, Johns Hopkins, Harvard, Yale,
234
00:11:50,385 --> 00:11:55,725
Stanford, UCLA, USC, wherever you're
comfortable, and just put in artificial
235
00:11:55,725 --> 00:11:59,225
intelligence and children, and you'll,
you'll get a whole host of information.
236
00:12:00,610 --> 00:12:03,560
In summary, artificial
intelligence is not our future.
237
00:12:03,620 --> 00:12:04,450
It is our present.
238
00:12:05,200 --> 00:12:06,150
It is here now.
239
00:12:06,520 --> 00:12:08,890
I would, I would argue
it's been here for a while.
240
00:12:09,050 --> 00:12:13,709
The most important part about the
learning is really, um, uh, the machine
241
00:12:13,709 --> 00:12:18,580
learning, uh, which is the most basic,
and then it goes all the way up to,
242
00:12:18,900 --> 00:12:21,380
um, the machine being able to feel.
243
00:12:21,570 --> 00:12:23,620
Uh, the chatbots are very, very nice.
244
00:12:24,020 --> 00:12:27,540
Uh, they're, they're put in
place to be, uh, evaluated and
245
00:12:27,540 --> 00:12:28,960
to be attractive to children.
246
00:12:29,210 --> 00:12:32,859
But as parents, it's our job to
protect our children from everything.
247
00:12:33,350 --> 00:12:36,530
You know, particularly those things
that are not in their best interest.
248
00:12:36,900 --> 00:12:40,819
Robots and, and certain types of
AI have been used for years, uh,
249
00:12:40,829 --> 00:12:45,749
but it's important to recognize
that these artificial intelligence
250
00:12:45,939 --> 00:12:51,874
types of material and, and, and
representatives have downsides.
251
00:12:52,084 --> 00:12:55,594
And we need to be aware of those
downsides because then and only
252
00:12:55,594 --> 00:12:57,484
then can we protect our children.
253
00:12:57,784 --> 00:12:59,724
Real quickly, my basic principles.
254
00:12:59,734 --> 00:13:01,984
I always like to, um, mention this.
255
00:13:02,185 --> 00:13:03,974
God is in charge of my life.
256
00:13:04,124 --> 00:13:04,974
He always has been.
257
00:13:05,305 --> 00:13:09,584
Um, and, uh, he directs
where I go and what I do.
258
00:13:10,115 --> 00:13:12,545
Um, I, I don't have bad days.
259
00:13:12,715 --> 00:13:14,755
I figured out that my
days are good or bad
260
00:13:15,015 --> 00:13:19,015
when I said they were, and so I
don't, I no longer say they are bad.
261
00:13:20,145 --> 00:13:22,795
Don't sweat the small stuff
and most stuff is small.
262
00:13:23,275 --> 00:13:24,535
Uh, I've learned that.
263
00:13:24,565 --> 00:13:28,255
And so I don't worry about little
things anymore as much as I used to.
264
00:13:28,634 --> 00:13:30,455
Forgiveness is therapy.
265
00:13:30,714 --> 00:13:33,765
So when people do or say
things to you that you feel,
266
00:13:34,134 --> 00:13:38,135
um, uh, is, is not, uh, appropriate.
267
00:13:38,345 --> 00:13:39,064
Forgive them.
268
00:13:39,704 --> 00:13:40,395
Forgive them.
269
00:13:40,764 --> 00:13:43,814
Don't try to figure it out, don't
look at the facts, just forgive them.
270
00:13:43,994 --> 00:13:50,844
Um, uh, and finally, um, relationships,
everything is a relationship.
271
00:13:51,290 --> 00:13:54,320
Based on mutual respect, mutual
trust, good communication.
272
00:13:54,839 --> 00:13:58,189
Um, if you have those three things,
you have a good relationship.
273
00:13:58,609 --> 00:14:00,999
If you do not, then you have work to do.
274
00:14:01,219 --> 00:14:02,339
Are there any questions?
275
00:14:02,499 --> 00:14:04,169
Anybody online with any questions?
276
00:14:04,170 --> 00:14:05,639
Hi, Dr.
277
00:14:05,639 --> 00:14:05,729
Choctaw.
278
00:14:06,449 --> 00:14:08,389
Um, I had a question.
279
00:14:08,389 --> 00:14:11,589
I don't know if it was a year or two ago.
280
00:14:11,589 --> 00:14:17,584
There was, um, I think it was New York,
or somewhere, and there was a stadium
281
00:14:18,055 --> 00:14:22,755
and there was a particular law firm
representing, like, a group. And they had
282
00:14:22,765 --> 00:14:26,334
to, they had, like, face recognition,
and they didn't let those people
283
00:14:26,395 --> 00:14:29,955
come into the stadium. Is there any...
284
00:14:30,555 --> 00:14:34,084
Like, is there regulation against
that, or is it just each
285
00:14:34,085 --> 00:14:35,505
company can do what they want?
286
00:14:35,505 --> 00:14:39,275
And if they don't like how you voted or
how you did something they can exclude
287
00:14:39,275 --> 00:14:43,105
you from a venue even though you paid
for the tickets? And, you know, I think
288
00:14:43,105 --> 00:14:46,115
it was like a basketball stadium or
something, a basketball game or something.
289
00:14:46,665 --> 00:14:50,375
Yes, it was, and that's an excellent
question, because, yeah, and your
290
00:14:50,375 --> 00:14:54,685
question goes to, well, what kind of
regulation is protecting us, the public?
291
00:14:55,205 --> 00:14:58,395
From all of these things that are
happening, and the reality is the
292
00:14:58,395 --> 00:15:03,855
regulation is not fast enough, and it's
not smart enough, because you're right,
293
00:15:04,005 --> 00:15:08,685
uh, law enforcement obviously is trying
to be ahead of the game, and if they
294
00:15:08,685 --> 00:15:13,255
can put facial recognition and identify
you at a stadium among thousands of
295
00:15:13,265 --> 00:15:18,475
people, uh, you really have your rights
being more limited than they should be.
296
00:15:18,715 --> 00:15:23,925
The reality is that our Congress,
most of them are, I don't know,
297
00:15:24,229 --> 00:15:28,939
septuagenarians or, or whatever,
and they don't even understand
298
00:15:29,150 --> 00:15:31,150
the, the artificial intelligence.
299
00:15:31,310 --> 00:15:32,270
So it's a problem.
300
00:15:32,560 --> 00:15:33,829
They're trying to catch up.
301
00:15:34,220 --> 00:15:38,589
They're using consultants and some
of the people with AI, uh, at
302
00:15:38,590 --> 00:15:40,680
Microsoft and Google, are trying to help.
303
00:15:41,140 --> 00:15:43,629
But it's not there yet,
to be honest with you.
304
00:15:43,900 --> 00:15:47,689
And so right now, we do
not have good regulation to
305
00:15:47,689 --> 00:15:49,680
protect the public right now.
306
00:15:50,515 --> 00:15:50,895
Thank you.
307
00:15:50,905 --> 00:15:51,505
Good question.
308
00:15:52,125 --> 00:15:52,935
Any other questions?
309
00:15:53,185 --> 00:15:53,585
Yes, sir.
310
00:16:00,925 --> 00:16:02,904
Some of what, uh, Faye was asking.
311
00:16:03,745 --> 00:16:05,415
The government is slow.
312
00:16:05,645 --> 00:16:10,535
Um, you know, if you watch CNN or
anything, you know, the process of getting
313
00:16:10,535 --> 00:16:14,665
things done is very deliberate, and it
goes back and forth and back and forth.
314
00:16:14,844 --> 00:16:17,924
And as we mentioned, this
artificial intelligence stuff
315
00:16:17,934 --> 00:16:19,864
moves in seconds and minutes.
316
00:16:20,244 --> 00:16:22,775
And so the government is
already at a disadvantage.
317
00:16:23,319 --> 00:16:28,020
It is trying to do some of those
things, but it is still very, very
318
00:16:28,020 --> 00:16:32,320
slow, and chances are, even though
it may do some things, it may not,
319
00:16:32,350 --> 00:16:33,910
it may not really do it fast enough.
320
00:16:34,559 --> 00:16:37,620
Um, I think that it's
already at the amygdala.
321
00:16:38,115 --> 00:16:40,115
But that's just me, to be honest with you.
322
00:16:40,425 --> 00:16:40,815
Yes, ma'am.
323
00:16:43,285 --> 00:16:46,314
Way faster than regulation.
324
00:16:46,584 --> 00:16:46,805
Yeah.
325
00:16:46,905 --> 00:16:52,855
So by the time the government catches up
with what they're doing right now, the
326
00:16:52,855 --> 00:16:56,655
scientists have evolved 10, 15 years.
327
00:16:56,665 --> 00:17:01,820
So all we have to do
is then to be careful.
328
00:17:02,270 --> 00:17:06,630
Well, I think, I think the government
is, it will do some things and they'll,
329
00:17:06,670 --> 00:17:08,280
they'll try to make it retroactive.
330
00:17:08,280 --> 00:17:12,119
And so I don't want it to sound like
that nothing is done, but to your point,
331
00:17:12,420 --> 00:17:16,200
scientists have already started before the
government even knew about the problem.
332
00:17:16,580 --> 00:17:19,129
Uh, most of the times, then when
the government finds out about it,
333
00:17:19,130 --> 00:17:20,899
it has to understand the problem.
334
00:17:21,349 --> 00:17:24,279
Um, and then after it understands
the problem, it's then got to
335
00:17:24,279 --> 00:17:26,660
agree on what the remedy is.
336
00:17:26,880 --> 00:17:30,610
And each one of those three stages
takes months, sometimes months and
337
00:17:30,610 --> 00:17:35,460
years, whereas, uh, this stuff is just
going, you know, in seconds again.
338
00:17:35,830 --> 00:17:39,590
Uh, so I think the government will
do some things, but it probably will
339
00:17:39,590 --> 00:17:44,350
not be 100 percent adequate, which is
why conversations like the ones we're
340
00:17:44,360 --> 00:17:46,009
having this morning are so important.
341
00:17:46,270 --> 00:17:50,889
We have to do it ourselves and, and
not rely on the government 100%.
342
00:17:51,090 --> 00:17:54,670
Yeah, the government may help, but
we, we've got to take care of our own.
343
00:17:55,160 --> 00:17:57,120
Uh, and again, knowledge is power.
344
00:17:59,195 --> 00:18:05,855
When you're talking about AI, I
remember watching the, um, I think
345
00:18:05,855 --> 00:18:12,684
it was Chicago police, um, on TV
and it showed, it was AI, that they
346
00:18:12,685 --> 00:18:18,414
had used the system to determine
whether someone is guilty or innocent.
347
00:18:19,314 --> 00:18:26,284
And they made a mistake in regards to
people of color, and it was never tested
348
00:18:26,840 --> 00:18:27,990
on people of color.
349
00:18:28,190 --> 00:18:34,680
It was tested on the other race, and
as they went with AI,
350
00:18:35,160 --> 00:18:40,920
they accused a person of being
guilty of something, and it was facial
351
00:18:40,989 --> 00:18:47,100
recognition, and the person
was not guilty and ended up dying.
352
00:18:47,409 --> 00:18:51,760
And this, this is one of the
problems with bias, with bias.
353
00:18:52,304 --> 00:18:58,425
If, if I, if I look at, at, at Jesse,
and he's wearing black shoes, and I
354
00:18:58,425 --> 00:19:03,165
say, all people who wear black shoes
are, and you can fill in the blank,
355
00:19:03,185 --> 00:19:06,514
whatever you want to fill in, that,
that's the way I view people with black
356
00:19:06,515 --> 00:19:11,844
shoes, so when I see somebody with
black shoes, I say, uh-huh, I know.
357
00:19:12,195 --> 00:19:13,764
All because I have that belief system.
358
00:19:13,764 --> 00:19:17,425
Remember we talked about: our beliefs
affect how we think, and how we think
359
00:19:17,435 --> 00:19:22,125
affects how we feel, and how we feel
affects how we act. If I have that belief
360
00:19:22,125 --> 00:19:25,904
system, that if you wear black shoes
you more likely than not are going
361
00:19:25,904 --> 00:19:30,224
to steal somebody's purse, or however
you want to put it, then it
362
00:19:30,224 --> 00:19:34,375
is not that uncommon that I'm going to
create stuff to that point. In healthcare,
363
00:19:34,405 --> 00:19:40,270
and I've talked about this before, uh,
we treated a heart attack a certain way,
364
00:19:40,280 --> 00:19:43,889
we treated women with heart attacks
the same way we treated men.
365
00:19:44,280 --> 00:19:49,230
And that's because it was based on
data that was biased against women.
366
00:19:49,819 --> 00:19:52,199
Because the information that
we used to come up with those
367
00:19:52,199 --> 00:19:53,610
ideas didn't include women.
368
00:19:54,050 --> 00:19:56,979
The same thing is true with,
with, with different racial groups.
369
00:19:56,999 --> 00:19:58,530
They didn't include African Americans.
370
00:19:58,790 --> 00:20:01,249
They were, they only included Caucasians.
371
00:20:01,699 --> 00:20:04,009
And, and so again,
garbage in, garbage out.
372
00:20:04,629 --> 00:20:09,000
Uh, and, and that, that's one of the
concerns, uh, but at least if you're aware
373
00:20:09,000 --> 00:20:14,220
of it, then you don't accept everything
just because they say it is. Uh, I am, and
374
00:20:14,220 --> 00:20:18,509
I, I mentioned this in a sense, I, I
won't, I'm really skeptical about numbers,
375
00:20:19,209 --> 00:20:23,810
you give me a number, you know, I'm going
to say, so where'd you get that number?
376
00:20:24,360 --> 00:20:26,170
And how did, how did they get it?
377
00:20:26,220 --> 00:20:26,710
Whatever.
378
00:20:27,060 --> 00:20:31,770
Because many times numbers and
pseudoscience or fake science
379
00:20:31,990 --> 00:20:33,870
are used to influence people.
380
00:20:34,230 --> 00:20:35,649
Because then it stops debate.
381
00:20:35,660 --> 00:20:38,279
You don't question, well, you
know, they said that, that the
382
00:20:38,280 --> 00:20:39,720
statistics say blah, blah, blah.
383
00:20:39,890 --> 00:20:41,139
Well, what's statistics?
384
00:20:41,400 --> 00:20:42,860
Who put those statistics together?
385
00:20:42,860 --> 00:20:43,450
Blah, blah, blah.
386
00:20:43,640 --> 00:20:47,170
So my, my suggestion is be a
little skeptical about everything.
387
00:20:47,780 --> 00:20:51,050
You know, that doesn't mean you're
distrustful, but ask questions.
388
00:20:51,470 --> 00:20:52,250
Ask questions.
389
00:20:52,630 --> 00:20:57,040
I, and I think our government will
help us, but I, I, I believe it won't
390
00:20:57,040 --> 00:21:02,360
be fast enough, and that we're going to
have to supplement that protection by
391
00:21:02,370 --> 00:21:04,230
having knowledge and power ourselves.
392
00:21:04,879 --> 00:21:10,659
Doc, let me, let me say this without
jeopardizing myself as much as I
393
00:21:10,659 --> 00:21:13,630
possibly can. Uh, an incident did happen.
394
00:21:14,160 --> 00:21:17,830
What we're doing, what we're trying
to do was put in the dump starts in,
395
00:21:18,050 --> 00:21:23,700
and I won't go into that detail, to,
uh, have the machine to, to adhere to
396
00:21:23,769 --> 00:21:25,530
our, uh, our, uh, our, uh... Okay, okay.
397
00:21:25,530 --> 00:21:29,220
That's what I'm saying about
it: the machine exceeds, like,
398
00:21:29,250 --> 00:21:30,350
it thinks faster than we do.
399
00:21:30,500 --> 00:21:31,300
Yes, it does.
400
00:21:31,310 --> 00:21:35,430
So it exceeded the parameters that
we were trying to put in for it.
401
00:21:35,590 --> 00:21:36,110
Yes.
402
00:21:36,319 --> 00:21:37,399
They're still working on it.
403
00:21:37,439 --> 00:21:39,320
Yes, I understand.
404
00:21:39,419 --> 00:21:41,240
Uh, but that's as much
as I can say about it.
405
00:21:41,269 --> 00:21:42,699
No, I, I, I understand.
406
00:21:42,699 --> 00:21:43,369
I understand exactly.
407
00:21:43,749 --> 00:21:45,609
And my point is that...
408
00:21:46,030 --> 00:21:50,149
Since, since we humans are catching
up, you know, and, and we, we didn't
409
00:21:50,149 --> 00:21:51,639
know how this stuff was going to happen.
410
00:21:51,909 --> 00:21:55,189
So, so we have, the first thing
is being aware of the problem.
411
00:21:55,560 --> 00:21:57,749
Now we can come up with
ways to mitigate that.
412
00:21:57,860 --> 00:22:01,909
You know, we could modify the machines,
change the data, do different things.
413
00:22:02,230 --> 00:22:03,590
Whatever works.
414
00:22:03,990 --> 00:22:09,370
But, but we need to be aware that
it's not 100%, 100%, um, and, and,
415
00:22:09,370 --> 00:22:12,899
and that's just the knowledge and
the gift that God has given us.
416
00:22:13,099 --> 00:22:13,459
Yeah.
417
00:22:13,590 --> 00:22:14,000
Right.
418
00:22:14,110 --> 00:22:16,040
That, that God expects us
419
00:22:16,524 --> 00:22:20,865
to use the minds that he's given us
and the way he's blessed us, and say,
420
00:22:21,095 --> 00:22:25,955
give me more information, I'm not quite
there yet. And we can do that in love.
421
00:22:25,955 --> 00:22:28,355
It doesn't have to be a
confrontational thing or a mean
422
00:22:28,365 --> 00:22:30,714
thing. Uh, but that's how we grow.
423
00:22:30,844 --> 00:22:35,784
That's how we all grow. By the way, I can
understand why the gentleman resigned.
424
00:22:35,785 --> 00:22:41,525
I can understand, so can I, so can I,
because if I'm in a situation where
425
00:22:41,525 --> 00:22:47,940
I fundamentally disagree with what's
going on, and I cannot change it, or
426
00:22:47,980 --> 00:22:50,450
I feel like I'm not being listened to.
427
00:22:51,145 --> 00:22:53,705
I will, I will move to a
different, different area.
428
00:22:53,995 --> 00:22:55,095
I, I will walk away.
429
00:22:55,804 --> 00:22:56,094
Yeah.
430
00:22:56,324 --> 00:23:01,835
Um, uh, and, and I think that that's okay,
but, but we, we have to stay empowered.
431
00:23:01,965 --> 00:23:02,825
That's what I'm trying to say.
432
00:23:03,064 --> 00:23:04,195
We don't want to be limited.
433
00:23:04,205 --> 00:23:07,035
We don't want to be led, you know,
this way, that way, that way.
434
00:23:07,995 --> 00:23:08,954
Any other questions?
435
00:23:09,185 --> 00:23:10,525
I just wanted to make a comment.
436
00:23:10,625 --> 00:23:15,475
Um, in my experience with it, I was
at my beautician and she's like, Oh,
437
00:23:15,505 --> 00:23:18,385
I can show you what you look like,
but with, like, a different hairstyle.
438
00:23:18,395 --> 00:23:19,774
She takes a picture of you.
439
00:23:20,215 --> 00:23:25,164
So she took a picture on her phone and
then it made all of these images with
440
00:23:25,164 --> 00:23:28,244
my face and, like, put me in a body.
441
00:23:28,244 --> 00:23:29,195
So I'm going to show you.
442
00:23:29,195 --> 00:23:31,945
So this is one of them, so I'm
not sure if you can see it.
443
00:23:31,945 --> 00:23:32,724
So this is what...
444
00:23:33,225 --> 00:23:38,294
She took my face and that's the image
that came out, but she could like post
445
00:23:38,294 --> 00:23:40,254
that someplace and say that was my face.
446
00:23:40,254 --> 00:23:41,684
She was, you know what I mean.
447
00:23:41,684 --> 00:23:44,194
So that kind of scared me
because it was different.
448
00:23:44,344 --> 00:23:47,324
I mean, there's other hairstyles that
it showed and, you know, different
449
00:23:47,524 --> 00:23:53,304
clothes and it was like instant
that she made this, you know, right.
450
00:23:55,035 --> 00:23:58,305
Well, you know, that battle is being
fought right now in places like
451
00:23:58,305 --> 00:24:02,165
Hollywood and all the people who
are very artistic, who use, because
452
00:24:02,165 --> 00:24:03,454
that's, I didn't even go into that.
453
00:24:03,654 --> 00:24:09,154
That's a whole different area in terms of
the arts, you know, pictures and images,
454
00:24:09,655 --> 00:24:11,964
which are even more difficult to control.
455
00:24:12,174 --> 00:24:13,605
But that's clearly a part of it.
456
00:24:14,165 --> 00:24:18,514
You know, because someone could
hijack your, your image, you know,
457
00:24:18,514 --> 00:24:21,165
and say, is this the person, blah,
blah, blah, blah, blah, and I say,
458
00:24:21,165 --> 00:24:24,235
yeah, that's her, that's her, and
you don't know anything about that.
459
00:24:24,435 --> 00:24:26,665
And these are some of the things
that we, we have to correct
460
00:24:26,905 --> 00:24:28,595
and just, just to be aware of.
461
00:24:28,954 --> 00:24:30,805
Yeah, but that, that is
definitely a problem.
462
00:24:31,145 --> 00:24:34,114
And it's more difficult, I
think, to control than the words.
463
00:24:34,865 --> 00:24:39,795
Yeah, yeah, because I just, I didn't
know what I thought it was going to show,
464
00:24:40,375 --> 00:24:42,185
but I was just like, oh my goodness.
465
00:24:43,525 --> 00:24:44,315
Yeah, yeah.
466
00:24:44,554 --> 00:24:48,005
And probably the safest thing
is to just say, no, thank you.
467
00:24:48,035 --> 00:24:48,935
No, that's okay.
468
00:24:49,184 --> 00:24:53,034
I, I, especially young people, because they
take pictures all the time and selfies and
469
00:24:53,065 --> 00:24:54,915
this, they don't think anything about it.
470
00:24:55,145 --> 00:24:59,025
But that, that's where it's probably
best just to slow it down a bit, if possible.
471
00:24:59,215 --> 00:25:00,025
No, good, good point.
472
00:25:00,970 --> 00:25:04,540
Sure, sure, you don't even think
about it, because it's done so fast
473
00:25:04,540 --> 00:25:05,860
and you don't want to be the bad guy.
474
00:25:07,124 --> 00:25:12,614
Right, right, right, because
if they have no picture.
475
00:25:12,704 --> 00:25:13,614
Yeah, exactly.
476
00:25:13,954 --> 00:25:14,485
Exactly.
477
00:25:14,674 --> 00:25:15,274
Good point.
478
00:25:15,485 --> 00:25:17,704
Well, it's happening
right now in Hollywood.
479
00:25:18,214 --> 00:25:25,684
Um, the, the artists and, and the, uh,
the movie stars are realizing that
480
00:25:25,724 --> 00:25:33,584
out there, someone is posting pictures
of them that were created, uh, with AI.
481
00:25:33,584 --> 00:25:35,455
It's not a real picture, but it is.
482
00:25:35,794 --> 00:25:36,995
But it looks so real.
483
00:25:36,995 --> 00:25:37,294
Right.
484
00:25:37,394 --> 00:25:38,284
And you can't tell.
485
00:25:38,304 --> 00:25:42,655
Like you say, they can, they
can place that image wherever,
486
00:25:43,104 --> 00:25:45,114
either to damage or to benefit us.
487
00:25:45,195 --> 00:25:48,495
So we have to be careful that
the risk is out there already.
488
00:25:48,745 --> 00:25:49,054
Right.
489
00:25:49,274 --> 00:25:49,964
And, and think about it.
490
00:25:49,964 --> 00:25:50,725
If you're an artist.
491
00:25:51,385 --> 00:25:56,035
And your livelihood is based on your
creativity, and someone steals your
492
00:25:56,035 --> 00:25:58,045
creativity and then embellishes it.
493
00:25:58,345 --> 00:26:01,804
They can put you out of
business, or your career is over.
494
00:26:02,435 --> 00:26:04,705
Because the public doesn't know, you know.
495
00:26:04,885 --> 00:26:07,475
And these are battles that
are being fought right now.
496
00:26:07,904 --> 00:26:10,685
And they haven't been resolved
yet, but they're being fought.
497
00:26:16,255 --> 00:26:17,825
Right, that's why they were demonstrating.
498
00:26:18,555 --> 00:26:19,325
Yeah, yeah.
499
00:26:22,705 --> 00:26:26,465
They can use their images and put them
in movies without even having them there.
500
00:26:26,770 --> 00:26:30,679
And we can't tell the difference, and they
do that to some extent, but they put
501
00:26:30,679 --> 00:26:34,909
some of them in movies and make them look
younger, you know, and you said, no,
502
00:26:34,909 --> 00:26:39,749
that, that person's as old as I am,
how can they look like, how would they
503
00:26:39,750 --> 00:26:44,500
look that way, you know. All of this
is happening before our very eyes, seriously.
504
00:26:44,839 --> 00:26:45,129
Right.
505
00:26:45,159 --> 00:26:46,199
And they don't have to get paid for it.
506
00:26:46,449 --> 00:26:49,929
And that's what I mean, that the company
can just kick them out and says, no, no,
507
00:26:49,930 --> 00:26:55,120
no, we're going to go with the avatar,
you know, your avatar, we don't need you.
508
00:26:55,380 --> 00:26:56,630
We just use your avatar.
509
00:26:57,215 --> 00:26:57,735
Exactly.
510
00:26:58,145 --> 00:27:00,945
You know, uh, just a
word about the profiling
511
00:27:01,515 --> 00:27:02,425
aspect of it.
512
00:27:02,515 --> 00:27:02,895
Mm hmm.
513
00:27:03,225 --> 00:27:08,525
Uh, there was a movie called
Minority Report. Okay, it does that,
514
00:27:08,565 --> 00:27:13,045
if you, if you haven't watched that,
that... And I think it's like, I don't
515
00:27:13,045 --> 00:27:14,765
know, seven, eight, ten years ago.
516
00:27:14,925 --> 00:27:20,801
That was with Tom Cruise, uh, I believe,
either him or, or Matt Damon or... Right.
517
00:27:20,801 --> 00:27:24,235
And it was one of those futuristic movies.
518
00:27:24,245 --> 00:27:29,655
Yes, and they would, it would arrest
people just based on the information.
519
00:27:29,694 --> 00:27:30,134
Right.
520
00:27:30,445 --> 00:27:32,665
You know, five years from
now, you're going to do this.
521
00:27:32,675 --> 00:27:33,424
So, right, right.
522
00:27:33,425 --> 00:27:34,915
So they would arrest you ahead of time.
523
00:27:35,935 --> 00:27:38,985
And then the problem is because if you
get in the computer, then you can screw
524
00:27:38,985 --> 00:27:41,605
it up and make people get arrested who
525
00:27:42,040 --> 00:27:42,850
shouldn't get arrested.
526
00:27:42,880 --> 00:27:45,490
I think that was part of the whole plot.
527
00:27:45,520 --> 00:27:45,720
Yeah.
528
00:27:45,970 --> 00:27:46,270
Yeah.
529
00:27:46,910 --> 00:27:47,220
Right.
530
00:27:47,980 --> 00:27:51,600
And, and again, I, you know, I, this
is my opinion and I'll stop with it.
531
00:27:51,790 --> 00:27:56,949
I think, you know, God sets up
things a certain way for a reason.
532
00:27:57,420 --> 00:28:00,869
Now we, we humans try to understand
it and I'm a scientist and I try
533
00:28:00,869 --> 00:28:05,610
to understand it, but at the end
of the day, you know, we may not
534
00:28:05,640 --> 00:28:07,190
understand it a hundred percent.
535
00:28:07,720 --> 00:28:11,700
Um, and a lot of times when we try to
understand it so much, we screw it up.
536
00:28:12,120 --> 00:28:13,110
We screw it up.
537
00:28:13,510 --> 00:28:15,900
Uh, but, but this is where we are now.
538
00:28:16,080 --> 00:28:17,280
This is where we are now.
539
00:28:17,530 --> 00:28:20,170
And I think all of us,
Christians and everybody else,
540
00:28:20,390 --> 00:28:21,759
we need to be knowledgeable.
541
00:28:22,060 --> 00:28:23,219
We need to be knowledgeable.
542
00:28:23,219 --> 00:28:24,160
We need to study.
543
00:28:24,170 --> 00:28:25,110
We need to read.
544
00:28:25,439 --> 00:28:26,589
We need to understand.
545
00:28:26,590 --> 00:28:32,400
We need to teach, educate, et cetera, so
we can lead in a proper, a proper way.
546
00:28:32,600 --> 00:28:33,560
Any other slides?
547
00:28:33,900 --> 00:28:34,900
I mean, any other questions?
548
00:28:35,625 --> 00:28:38,064
I know it's harsh, indeed.
549
00:28:40,045 --> 00:28:40,525
Yes, sir.
550
00:28:41,014 --> 00:28:43,414
Pastor Jose Rivas: Not a question,
but I think it would be good to
551
00:28:43,424 --> 00:28:46,344
bring this topic, uh, into church.
552
00:28:47,635 --> 00:28:53,504
So, because, I mean, once, uh,
I have to tell Siri to shut up
553
00:28:53,504 --> 00:28:53,804
Dr William T. Choctaw: here.
554
00:28:53,825 --> 00:28:56,874
Ha ha.
555
00:28:57,534 --> 00:28:58,774
So your phone is going off.
556
00:28:58,814 --> 00:28:59,324
I was like, yeah.
557
00:29:01,314 --> 00:29:02,495
They're just watching you speak, man.
558
00:29:02,495 --> 00:29:03,994
Yeah,
559
00:29:05,675 --> 00:29:07,754
Pastor Jose Rivas: because of
the, you know how the end times
560
00:29:07,754 --> 00:29:09,175
are, I mean, are coming in.
561
00:29:09,475 --> 00:29:13,604
And who knows, this might be
Babel hour, or a Babel problem.
562
00:29:13,804 --> 00:29:18,694
This thing, you know, bringing to a
point so we can be recognized in such
563
00:29:18,694 --> 00:29:20,404
a way that it's our own distraction.
564
00:29:20,814 --> 00:29:22,124
I mean, that's going to the limit.
565
00:29:22,134 --> 00:29:27,410
But what I'm saying is it's important
for us to know, uh, how to teach it in the
566
00:29:27,410 --> 00:29:31,980
church, because we can go to the fanatic
side or the unrealistic side, but we
567
00:29:31,990 --> 00:29:38,140
have to be knowledgeable on that too,
because it will, it will involve us.
568
00:29:38,400 --> 00:29:39,769
And we are involved already.
569
00:29:39,950 --> 00:29:43,109
I mean, it's either with
knowledge or without it.
570
00:29:43,549 --> 00:29:45,939
Dr William T. Choctaw: This
thing, you know, Pastor, I
571
00:29:46,040 --> 00:29:47,229
think that's an excellent point.
572
00:29:47,350 --> 00:29:49,659
I, I, I, I, and this is just me now.
573
00:29:49,970 --> 00:29:56,390
I, I think as a Christian, the way I
evangelize, um, is I, again, I think you
574
00:29:56,390 --> 00:30:01,490
go where people are, you know, and if
we're concerned about losing people, ask
575
00:30:01,490 --> 00:30:03,130
yourself, well, why is that happening?
576
00:30:03,510 --> 00:30:06,130
And I'm going to submit
it's not always their fault.
577
00:30:06,479 --> 00:30:10,860
Maybe, just maybe, we Christians
can do a little better.
578
00:30:11,109 --> 00:30:16,310
Maybe, to your point, we can reach out
to them and say, you know what, we're
579
00:30:16,310 --> 00:30:17,780
going to have a discussion about AI.
580
00:30:18,475 --> 00:30:22,945
Because we recognize blah, blah, blah,
blah, blah, blah, blah, blah, that maybe
581
00:30:23,205 --> 00:30:26,045
that, that people say, well, you know,
I'm going to start going to that church
582
00:30:26,045 --> 00:30:29,444
because I, I've been worried about
this stuff and, or whether it's don't
583
00:30:29,445 --> 00:30:35,164
make a mess or whatever, that the more
relevant we are to their life allows
584
00:30:35,164 --> 00:30:39,015
them to maybe to pay more attention to
what we want to communicate to them.
585
00:30:39,540 --> 00:30:43,500
But, but if they don't believe that
we're relevant to their life, they,
586
00:30:43,510 --> 00:30:46,850
they may say, well, you know, I've
got so much to do over here, I, I,
587
00:30:46,850 --> 00:30:48,410
I'm sorry, but thank you very much.
588
00:30:48,660 --> 00:30:50,650
And so, to your point,
I, I agree with you.
589
00:30:50,680 --> 00:30:54,060
I think the more we can do, uh,
and hence, one of the reasons
590
00:30:54,060 --> 00:30:57,990
for these masterclasses, uh,
is to say, this is real world.
591
00:30:58,210 --> 00:30:59,180
We understand that.
592
00:30:59,190 --> 00:31:00,190
We all live in it.
593
00:31:00,210 --> 00:31:02,170
This is how we get through it.
594
00:31:02,510 --> 00:31:05,110
How can we help you to get
through it, type thing.
595
00:31:05,540 --> 00:31:09,000
Um, and that that may be one
of the best ways to evangelize.
596
00:31:09,430 --> 00:31:14,500
Other than standing up high and speaking
down low, uh, to folks about what they
597
00:31:14,500 --> 00:31:17,100
should and should not do, just my opinion.
598
00:31:17,340 --> 00:31:21,220
Any other questions, comments,
Pastor Rivas, would you close us
599
00:31:21,220 --> 00:31:22,420
with the word of prayer, please?
600
00:31:22,600 --> 00:31:23,110
Pastor Jose Rivas: Yeah. Heavenly
601
00:31:23,115 --> 00:31:27,640
Father, we thank you for this day,
Lord God. In the word, Deuteronomy 29:29,
602
00:31:27,640 --> 00:31:32,500
it says that the things hidden
belong to you, and the ones
603
00:31:32,500 --> 00:31:34,780
that are revealed are for ourselves.
604
00:31:35,665 --> 00:31:36,860
And Father, we thank you.
605
00:31:37,085 --> 00:31:39,225
For even science belongs to you.
606
00:31:39,405 --> 00:31:43,725
You are the all-knowing,
omniscient God whom we serve.
607
00:31:44,585 --> 00:31:47,105
Therefore, Lord God, we thank
you for this opportunity.
608
00:31:47,285 --> 00:31:48,405
Thank you for Dr.
609
00:31:48,445 --> 00:31:51,604
Choctaw and the group, Lord
God, that make this possible.
610
00:31:53,255 --> 00:31:58,505
Bring us that information so we can
be better prepared for whatever you
611
00:31:58,505 --> 00:32:05,435
have us to do today and in the near
future, and in the future way beyond.
612
00:32:05,675 --> 00:32:11,235
Let us be the light of this world and
help us be the salt. In your precious name
613
00:32:11,240 --> 00:32:13,725
we ask, and we thank you for everything.
614
00:32:13,995 --> 00:32:14,535
Amen.
615
00:32:14,805 --> 00:32:15,165
Amen.
616
00:32:15,170 --> 00:32:15,450
Dr William T. Choctaw: Amen.
617
00:32:15,450 --> 00:32:15,730
Amen.
618
00:32:15,800 --> 00:32:16,090
Amen.
619
00:32:16,560 --> 00:32:16,850
Amen.
620
00:32:17,145 --> 00:32:17,505
Thank you.
621
00:32:17,680 --> 00:32:19,890
Bye-bye. Bye.
622
00:32:27,740 --> 00:32:28,090
Uh-huh.