AXSChat Podcast

AXSChat Podcast with Esther Dakin-Poole

September 25, 2020. Antonio Santos, Debra Ruh and Neil Milliken talk with Esther Dakin-Poole.

Hosted by Antonio Santos, Debra Ruh and Neil Milliken. 

Esther Dakin-Poole is a Council member of The British Assistive Technology Association (BATA) and Coordinator of the current UK Assistive Products List (UK APL) initiative for BATA and the WHO GATE Programme, designed to establish which assistive products are the most essential in the UK today. Esther is a Director and Secretary of GAATO, The Global Alliance of Assistive Technology Organizations, based in Geneva, Switzerland. She is also Head of Education & Development at Smile Smart Technology, her family’s enterprise, which innovates and provides assistive technology for children and adults with complex disabilities and specialises in neuro-disability switching, controls and linkages between digital access and mobility training. She grew up as a carer for her younger sister, who has Down’s syndrome and whom she sees as one of the most enlightening people in her life. Esther’s alma mater is SOAS, University of London, where she most recently studied international diplomacy, and she has a historic regional focus on East and Southern Africa and South-East Asia.

Support the Show.

Follow axschat on social media
Twitter:

https://twitter.com/axschat
https://twitter.com/AkwyZ
https://twitter.com/neilmilliken
https://twitter.com/debraruh

LinkedIn
https://www.linkedin.com/in/antoniovieirasantos/
https://www.linkedin.com/company/axschat/

Vimeo
https://vimeo.com/akwyz




WEBVTT

1
00:00:02.669 --> 00:00:11.610
Neil Milliken: Hello and welcome to AXSChat. I'm delighted that we're being joined today by Esther Dakin-Poole, who is doing some work with the British Assistive Technology Association and GATE

2
00:00:12.509 --> 00:00:21.660
Neil Milliken: About building an assistive products list. So can you tell us a little bit more about your work and how this came into being, please?

3
00:00:22.770 --> 00:00:31.770
Esther Dakin-Poole: Hi, Neil. Yeah, thank you very much for having me on. The assistive products list is a tool devised by the World Health Organization GATE programme,

4
00:00:32.580 --> 00:00:44.250
Esther Dakin-Poole: The Global Cooperation on Assistive Technology. And the idea is that we as a nation replicate a template that was devised on a global scale

5
00:00:45.000 --> 00:00:51.420
Esther Dakin-Poole: In order to establish that there are essential items, there are essential assistive products, that

6
00:00:51.840 --> 00:01:01.080
Esther Dakin-Poole: are ubiquitous around the world that should be considered essential very much following the model of the essential medicines list that's been so successful in

7
00:01:01.470 --> 00:01:09.900
Esther Dakin-Poole: Making medicine so available around the world. And the idea was that there would be a list drawn up by global players.

8
00:01:10.740 --> 00:01:23.400
Esther Dakin-Poole: In the world to decide which products they felt, if there were any (and indeed there were), should be essential, including spectacles, hearing aids,

9
00:01:24.270 --> 00:01:37.920
Esther Dakin-Poole: Forms of crutches, forms of wheelchairs, grab bars, accessibility tools for speech to text, and the list is extensive. They decided to cap it

10
00:01:39.210 --> 00:01:48.810
Esther Dakin-Poole: At 100 items that were on the survey and then decide upon, say, 50 for the final list that they felt were currently

11
00:01:49.560 --> 00:02:00.150
Esther Dakin-Poole: Essential. Or rather, they've used the word priority, because according to policy they can't use the word essential yet, because there isn't the data.

12
00:02:00.570 --> 00:02:11.580
Esther Dakin-Poole: So part of this tool is to find the data so that we can then help them use that word essential and say that these are deemed essential items.

13
00:02:12.030 --> 00:02:22.530
Esther Dakin-Poole: To every individual in the world. And ideally, they would have access to them by whatever form that means, through government procurement or individual procurement, but

14
00:02:23.310 --> 00:02:30.600
Esther Dakin-Poole: Through some format that's not for this conversation. And so that template is then taken by countries.

15
00:02:31.140 --> 00:02:40.950
Esther Dakin-Poole: To then devise their own format for their own culture, for their own country, be it mountainous, be it sandy, hot, cold,

16
00:02:41.430 --> 00:02:52.800
Esther Dakin-Poole: And rich or poor, and deciding then which might be their essential assistive products within their national

17
00:02:53.280 --> 00:03:05.190
Esther Dakin-Poole: Setup. And so we've taken that and we've turned it into a digital template and put that online. There's also a PDF at the moment. And the survey is very new.

18
00:03:05.940 --> 00:03:17.880
Esther Dakin-Poole: But we've garnered enormous traction already from stakeholders around the UK who also think, having been introduced to this idea of an essential

19
00:03:18.330 --> 00:03:30.510
Esther Dakin-Poole: Assistive products list, that this is something that could actually help bring us together and find out what items there are that should be highlighted as being essential.

20
00:03:32.010 --> 00:03:42.870
Esther Dakin-Poole: So that's what the British Assistive Technology Association have undertaken. They were invited to carry this work out with the WHO GATE programme

21
00:03:44.130 --> 00:03:54.810
Esther Dakin-Poole: Because they are a very central organization; they are not distinctly healthcare or education or workplace. They are

22
00:03:55.230 --> 00:04:15.150
Esther Dakin-Poole: Pretty much what it says on the tin: the British Assistive Technology Association. And almost universally we're all volunteers, so there is no financial incentive for us to be doing this; we just believe in bringing as many members of the UK AT ecosystem together

23
00:04:16.350 --> 00:04:31.020
Esther Dakin-Poole: To start this conversation. We're only the third country to undertake this. I think there are now 10 in progress around the world, which is really exciting, and Britain are the first high-income country to take this on.

24
00:04:32.700 --> 00:04:33.180
Esther Dakin-Poole: And

25
00:04:34.200 --> 00:04:34.530
Esther Dakin-Poole: Yeah.

26
00:04:34.620 --> 00:04:34.980
Neil Milliken: I've got a

27
00:04:35.370 --> 00:04:36.720
Neil Milliken: Question. I know that that

28
00:04:36.810 --> 00:04:47.520
Neil Milliken: Debra's also got one she wants to follow up on. It's interesting that you say we were the first high-income country. What pleases me

29
00:04:48.330 --> 00:04:54.390
Neil Milliken: And what I'd like to highlight is that low income countries are actually leading on this. So can you tell us

30
00:04:55.200 --> 00:05:08.940
Neil Milliken: Who are these forward-thinking low-income countries? Because we really do need to give credit to the Global South for sometimes leading on accessibility, which we often forget to do.

31
00:05:09.930 --> 00:05:10.980
Esther Dakin-Poole: You're absolutely right.

32
00:05:12.060 --> 00:05:16.260
Esther Dakin-Poole: So the first to undertake an APL was Tajikistan.

33
00:05:17.460 --> 00:05:27.660
Esther Dakin-Poole: And again, that was led by the WHO office there and the government, and the second was in Nepal. And I think that's a very, very good point,

34
00:05:28.200 --> 00:05:43.020
Esther Dakin-Poole: Because as you establish a new idea, your government takes on a new policy objective because they see what a difference it can make right across the spectrum, across so many ministries.

35
00:05:44.070 --> 00:05:53.940
Esther Dakin-Poole: You actually, when you're starting from fresh, there is very little there. So actually you can build using the latest knowledge, the latest technology, the latest research.

36
00:05:54.300 --> 00:06:04.080
Esther Dakin-Poole: So it's like traveling to Taiwan: they have the fastest trains, they have the snazziest stations. It's all brand new. It's incredibly accessible.

37
00:06:04.470 --> 00:06:21.900
Esther Dakin-Poole: And we have the London Underground, which is awesome, but you know it was built over 100 years ago and it's not desperately accessible to us today, and we can't build a whole new one. And I think the same analogy can be put forward for our ecosystem. We are,

38
00:06:22.860 --> 00:06:29.520
Esther Dakin-Poole: We would love to be able to start from fresh, but actually we just need to readjust what we already have.

39
00:06:30.540 --> 00:06:38.040
Esther Dakin-Poole: But the lovely thing, as you said, is that so many other countries who aren't in our financial situation can see the financial benefits.

40
00:06:38.580 --> 00:06:50.400
Esther Dakin-Poole: And the social benefits of instilling and taking on this policy initiative. So yeah, well done for highlighting it, very good point. Very good point. And Debra, you had a question?

41
00:06:51.000 --> 00:06:57.840
Debra Ruh: Yes, and it's coming from the States, and sometimes we complicate things, and so

42
00:06:58.650 --> 00:07:09.240
Debra Ruh: I know that we've had lists, we have lists. We actually had this gigantic database that we were maintaining in the States, and it had thousands and thousands of

43
00:07:09.630 --> 00:07:19.350
Debra Ruh: Assistive technologies. And one thing that I've seen is this sort of misuse, if that's even the right word I want to use, is that we,

44
00:07:19.920 --> 00:07:24.870
Debra Ruh: And I know this is different from what you're saying, but I'm just thinking about how you know I'm just

45
00:07:25.530 --> 00:07:32.820
Debra Ruh: I just am thinking about this perspective of it. And so we had corporations, for example, stepping up

46
00:07:33.150 --> 00:07:41.790
Debra Ruh: This was years ago, this was five, seven, eight years ago, and they were saying we're only going to support this kind of assistive technology. We used to do this, for example:

47
00:07:42.150 --> 00:07:53.550
Debra Ruh: You know, our website works best on this kind of browser, we used to say that years ago. And so it was sort of like that: the assistive technology tools that we recommend for you to use

48
00:07:53.880 --> 00:08:02.220
Debra Ruh: Are these, and then they would name certain vendors, and so other vendors came forward and felt that was very unfair. So while

49
00:08:02.640 --> 00:08:13.080
Debra Ruh: I agree with the essential list, I worry about how some people might misuse it and only think that, for example, one specific screen reader is better than others, or,

50
00:08:13.740 --> 00:08:17.610
Debra Ruh: You know, or you name 50 assistive technology

51
00:08:18.180 --> 00:08:28.260
Debra Ruh: Tools, products, solutions, but of course we have thousands, because we have so many different kinds of disabilities. So how do you manage this, if that's not a funny thing to say,

52
00:08:28.560 --> 00:08:44.280
Debra Ruh: And not have people actually misuse the information, and say no, we're not saying these are the only 50 assistive technology tools that you can use, sort of like that. And I'm sorry to speak from that perspective, but I am in the United States right now. Also, we are

53
00:08:45.090 --> 00:08:52.590
Esther Dakin-Poole: I think it's a really fair question. And it's a question that I heard right at the beginning of undertaking this project.

54
00:08:54.150 --> 00:09:08.010
Esther Dakin-Poole: Firstly, we're very fortunate that we live in the UK and we have a National Health Service. And I must say that this, obviously, as a tool that's hopefully emboldening an entire ecosystem,

55
00:09:09.450 --> 00:09:20.670
Esther Dakin-Poole: Is meant to actually open up the strength of the ecosystem and not narrow it down. The list does not have branded items on it.

56
00:09:21.600 --> 00:09:38.580
Esther Dakin-Poole: So it only has generic items; it doesn't have specific branded items. These are ubiquitous items, these are general items. We all wear spectacles. There are two forms of spectacles:

57
00:09:39.180 --> 00:09:57.060
Esther Dakin-Poole: Long-sighted, short-sighted; actually, there may be three. I think there's a third, which is for prisms. I have prisms, so I would tick that box, but that's as specific as you get. They all have an ISO number, so they're classified by ISO coding, not by brand.

58
00:09:58.350 --> 00:10:08.460
Esther Dakin-Poole: And if they are deemed essential, generally you'd imagine that they'd be needed by a very large number of people. So if they're already

59
00:10:09.060 --> 00:10:21.420
Esther Dakin-Poole: Being used in the UK context, for example, and they're deemed essential, they're not a new invention that's only on the market from one company and nobody else has ever heard of them. So

60
00:10:22.350 --> 00:10:30.780
Esther Dakin-Poole: The WHO obviously have incredibly strict guidelines about working with industry, and for the right reasons.

61
00:10:32.160 --> 00:10:43.440
Esther Dakin-Poole: But part of the discussions that we had at the very beginning was: how do we decide who can put forward items for consideration on the list?

62
00:10:43.770 --> 00:10:55.290
Esther Dakin-Poole: Do we include all industry? Do we exclude anybody? And, you know, we felt, I felt very strongly, that this has to be open to everybody to have their opinion,

63
00:10:55.830 --> 00:11:06.270
Esther Dakin-Poole: Because everyone is part of that ecosystem. You cannot say that business, be it big business or small business, is bad; they shouldn't be excluded.

64
00:11:07.290 --> 00:11:16.530
Esther Dakin-Poole: And everybody needs to have their part and their say within that. I think the essence of an APL is that it's not a means

65
00:11:17.280 --> 00:11:27.360
Esther Dakin-Poole: To itself. It's a means to an end, and that is to try to unify the whole of the ecosystem. It is about bringing together every single facet of it.

66
00:11:27.840 --> 00:11:33.690
Esther Dakin-Poole: Be they the people involved in prosthetics and orthotics, hearing, mobility,

67
00:11:34.500 --> 00:11:49.140
Esther Dakin-Poole: Cognition, the practitioners, the therapists. It is vast when you think about the number of people involved, organizations involved within AT as a whole, not just specific products but AT services as well.

68
00:11:50.100 --> 00:12:10.680
Esther Dakin-Poole: And part of that is to bring all of those people together all of those stakeholders, so that not just industry, not just insurance can then take hold of that information. This is about bringing together a very, very large body of people as a voice as a single voice.

69
00:12:11.790 --> 00:12:30.150
Esther Dakin-Poole: So yes, there always is a chance, I imagine, that these objectives can be misused. I'm not a government. But that's as much as I would hope to be able to reassure you, from the perspective of misuse of the essential list.

70
00:12:31.410 --> 00:12:36.330
Debra Ruh: That's amazing. That's an amazing answer. Thank you. Antonio, I know you had a question.

71
00:12:37.980 --> 00:12:41.370
Antonio Santos: I was reflecting on Esther's answer, and

72
00:12:42.420 --> 00:12:54.210
Antonio Santos: And, you know, looking at two sides and two more questions. One is, let's say, you know, today there's so much innovation, students developing prosthetics,

73
00:12:55.320 --> 00:12:58.290
Antonio Santos: Sometimes just because they feel,

74
00:12:59.280 --> 00:13:08.940
Antonio Santos: Students out of university. They feel that this is the right thing to do. I want to make a mark I want to create. You know, I really want to make a change and make a product more widely available to everyone.

75
00:13:09.390 --> 00:13:17.100
Antonio Santos: And how can someone in that stage, you know, engage and have that product considered?

76
00:13:18.150 --> 00:13:31.710
Antonio Santos: And the other is somehow linked, but there are also products out there in the market that were not necessarily built as assistive tech, but they have become part of assistive tech. Are you looking at them as well?

77
00:13:32.970 --> 00:13:34.050
Esther Dakin-Poole: Very good questions.

78
00:13:34.890 --> 00:13:43.170
Esther Dakin-Poole: In the first instance, for those students or innovators who are within the university setting, again,

79
00:13:43.800 --> 00:13:57.660
Esther Dakin-Poole: Referring to the comment that I made on Debra's question: it is not about individual items and individual innovations. This is about generic items that are very well established on the market.

80
00:13:58.890 --> 00:14:13.620
Esther Dakin-Poole: Because in a high-income country with a system like the UK, for example, or Portugal or the States, there are tens of thousands, if not more, items available.

81
00:14:14.160 --> 00:14:22.770
Esther Dakin-Poole: So this is about defining the items that already exist. This is not about the ones that haven't been invented yet or have only just been invented.

82
00:14:23.160 --> 00:14:35.040
Esther Dakin-Poole: This is about those that are utterly proven and that we know are essential, but we're actually putting it on paper for the first time, the idea is that the list would then be refreshed.

83
00:14:35.580 --> 00:14:53.460
Esther Dakin-Poole: Every couple of years. So every two to three years we come together as an entire system all over again, and we collect the same data and we revise it, we discuss it. And if, then, innovation has moved forward, research has moved forward, and we've decided that,

84
00:14:54.570 --> 00:15:07.140
Esther Dakin-Poole: You know, there's been some revolutionary new widget that's been invented and it's changed the world: designed today, it might be essential in six to ten years' time.

85
00:15:08.760 --> 00:15:19.830
Esther Dakin-Poole: But hopefully, by gathering this information, this data, we can make it clear where the gaps are in innovation. We can find out the barriers

86
00:15:20.610 --> 00:15:34.200
Esther Dakin-Poole: To finding, to procuring and accessing the products that already exist, so that then, if there are gaps, those are the things that can be innovated, those are the things that can be funded,

87
00:15:34.680 --> 00:15:54.390
Esther Dakin-Poole: So that university students and academic departments can invent those, not just reinventing things we've already got. You know, it's really important that we don't just have more money thrown at the wrong places. And so, to answer your second question...

88
00:15:55.980 --> 00:15:56.100
Esther Dakin-Poole: Um,

89
00:15:57.300 --> 00:15:58.170
Esther Dakin-Poole: The

90
00:15:59.520 --> 00:16:05.220
Esther Dakin-Poole: The system itself is a strong one, and,

91
00:16:06.930 --> 00:16:11.130
Esther Dakin-Poole: I think, I was just looking at the chat, it seems Neil's disappeared off.

92
00:16:13.140 --> 00:16:16.380
Esther Dakin-Poole: Could you elaborate upon that second question, Antonio?

93
00:16:16.380 --> 00:16:34.560
Antonio Santos: So sometimes we've seen that the user experience of new technology has improved a lot, and sometimes products that are designed more for the mass market, the consumer market, end up being

94
00:16:35.580 --> 00:16:53.850
Antonio Santos: Used as assistive tech, even if they are not necessarily designed for that, but they nevertheless are an important tool. I can give you the example of Alexa or voice devices, which were not necessarily designed with that in mind, but they end up being very helpful for some people.

95
00:16:54.210 --> 00:16:55.950
Esther Dakin-Poole: Absolutely. So I remember

96
00:16:56.250 --> 00:16:57.540
Esther Dakin-Poole: Listening to somebody

97
00:16:58.200 --> 00:17:11.130
Esther Dakin-Poole: At one of the select committees in the British government saying it's great that Alexa has been designed now, but we could have done with it 10 years ago. And this was a lady who was in a wheelchair and

98
00:17:12.060 --> 00:17:20.850
Esther Dakin-Poole: She was very profoundly disabled and had no use of her arms, and it would have been very useful for her to have an Alexa a long time ago.

99
00:17:21.360 --> 00:17:38.610
Esther Dakin-Poole: I think on the list, for example, in the UK, there are many devices there which could be superseded by a smartphone, because the technology is there now; it probably wasn't there five years ago.

100
00:17:39.840 --> 00:17:51.210
Esther Dakin-Poole: But we can't put, you know, we can't actually say that there's one device that actually should be available and essential, but I think part of this conversation is to say that globally.

101
00:17:52.080 --> 00:18:03.810
Esther Dakin-Poole: It's not sensible to be able to say that everybody needs to have a smartphone. You know, if you can't buy a loaf of bread, you're certainly not going to be able to buy a smartphone in Eastern Africa.

102
00:18:04.620 --> 00:18:12.630
Esther Dakin-Poole: But in the UK, where an awful lot of people can afford them, it's a good way to highlight that this technology does exist,

103
00:18:13.080 --> 00:18:33.210
Esther Dakin-Poole: And that maybe five of the items on the current list could be superseded with maybe one item that we then put forward as being something which is beneficial. So again, it was a mass-market item, as you said, not designed specifically as an AT device, but has now had an awful lot of

104
00:18:35.550 --> 00:18:42.780
Esther Dakin-Poole: Investment put into it to make it accessible, especially for those with VI and hearing issues.

105
00:18:43.740 --> 00:18:51.870
Antonio Santos: I can tell a story: I worked in the telecom sector for a few years, and I remember, you know,

106
00:18:53.190 --> 00:19:04.620
Antonio Santos: Around 2000, maybe a bit later, you know, we were providing services and we had users who were coming to us with a phone where there was a piece of software for assistive tech,

107
00:19:05.220 --> 00:19:18.600
Antonio Santos: And that software was linked to that specific phone. So if the phone was being replaced, if they were given a new one, they were almost forced to buy a new license for that product. So that's where we were.

108
00:19:19.860 --> 00:19:32.850
Antonio Santos: I think we've already evolved. So how do you see the changes in this market over the last couple of years, where assistive devices have definitely become more affordable?

109
00:19:33.780 --> 00:19:39.330
Esther Dakin-Poole: I think it's incredibly exciting, both in this country and globally. I think that the

110
00:19:40.350 --> 00:19:59.010
Esther Dakin-Poole: The capacity to be able to communicate, as we're discovering in COVID, is one of the most remarkable and exciting developments within tech. Unfortunately, the digital divide is growing larger for those who can't afford the smart devices, for example.

111
00:20:00.090 --> 00:20:05.670
Esther Dakin-Poole: It's fantastic. You know there's there's a lot there. But if you can't afford a smartphone.

112
00:20:07.620 --> 00:20:17.400
Esther Dakin-Poole: And you don't have access to one, or you could afford one but you don't have the physical, intellectual or psychosocial capacity to interact with one,

113
00:20:18.060 --> 00:20:32.370
Esther Dakin-Poole: Then you are pretty much excluded. And again, I'm not speaking on their behalf but from my own personal concerns: even on a global perspective, I think that is a massive issue that needs to be considered,

114
00:20:33.600 --> 00:20:38.070
Esther Dakin-Poole: Because the digital divide now could potentially grow larger.

115
00:20:39.510 --> 00:20:49.110
Esther Dakin-Poole: My sister, for example, has Down syndrome. She has a phone, she can answer it, and it's taken several years to teach her how to use FaceTime.

116
00:20:49.980 --> 00:21:03.270
Esther Dakin-Poole: But essentially, with speech to text, a lot of those devices that you think will be available: if you can't speak because you don't have a voice, or your tongue doesn't work the way that Siri wants to hear it,

117
00:21:03.720 --> 00:21:11.640
Esther Dakin-Poole: You don't have access to that digital technology. So yes, it's fantastic, and in COVID we've discovered that it is, you know, brilliant.

118
00:21:12.480 --> 00:21:22.590
Esther Dakin-Poole: But I think there is a big problem: we are so focused on what we, particularly the younger generations, think is great, and we've grown up with it,

119
00:21:23.130 --> 00:21:31.890
Esther Dakin-Poole: Compared with those who may be elders who'd never had it, and they don't particularly want to grasp how to use it, and those who physically can't.

120
00:21:32.610 --> 00:21:41.100
Esther Dakin-Poole: And input devices as well: there's a paucity of input devices, switches, switching for AAC users.

121
00:21:41.520 --> 00:21:48.030
Esther Dakin-Poole: There are fantastic gadgets at the other end, but the actual input devices really haven't changed much in

122
00:21:48.480 --> 00:21:54.270
Esther Dakin-Poole: Around 20 or 30 years; it's still a switch of some form, a very basic button switch.

123
00:21:54.750 --> 00:22:04.650
Esther Dakin-Poole: That's where, you know, we can have these conversations. We can say, fine, we love a digital device, but how are these people who are excluded going to access them?

124
00:22:05.070 --> 00:22:14.190
Esther Dakin-Poole: And then if we can find the barriers and we can define the gaps, that's where we know that we need to fix the system, and we can ask for funding and ask for innovation.

125
00:22:16.110 --> 00:22:25.440
Debra Ruh: My daughter has Down syndrome. She's 33 and she lives on her own in a supported apartment, and she uses Alexa in that way, and sometimes

126
00:22:25.800 --> 00:22:33.180
Debra Ruh: It is difficult for Alexa to understand my, I'm going to just say my daughter's Down syndrome accent, I'll just say it like that.

127
00:22:33.480 --> 00:22:40.620
Debra Ruh: But one thing I do like about what I've seen Alexa do is the AI behind the system.

128
00:22:40.980 --> 00:22:49.770
Debra Ruh: I mean, the AI is learning and does learn my daughter's speech patterns. That doesn't mean it works across the board; it certainly is not working with my husband,

129
00:22:50.040 --> 00:23:04.470
Debra Ruh: Who has lost his communication due to dementia. So I totally agree with you that it's a problem. I also wanted to just comment that the answer you were giving to Antonio was so brilliant, very impressed with the way you answered the

130
00:23:04.470 --> 00:23:13.500
Debra Ruh: Questions, but of course connectivity is a huge issue as well because we can have the devices. But if we don't have reliable connectivity for those devices.

131
00:23:13.740 --> 00:23:21.360
Debra Ruh: Then that doesn't do us any good. So as you were talking about, Esther, the digital divide, and the fear of us widening the digital divide, I think

132
00:23:21.570 --> 00:23:25.380
Debra Ruh: Connectivity has to be considered more in that conversation, too.

133
00:23:25.650 --> 00:23:35.790
Debra Ruh: But I also wanted to comment that I have seen just tremendous and I know everyone will agree with this tremendous lack of data on how people are using assistive technology.

134
00:23:36.090 --> 00:23:43.050
Debra Ruh: Across the world, a tremendous lack of data. And so I think this is so important, and I love

135
00:23:43.380 --> 00:23:52.290
Debra Ruh: That, that, of course, the UK stepped up because we keep seeing the UK really step up in this field so much I I applaud the UK in that effort.

136
00:23:52.740 --> 00:24:01.920
Debra Ruh: But I love, as Neil was saying, that it was started by lower-income, you know, developing countries and the Global South, because

137
00:24:02.280 --> 00:24:14.820
Debra Ruh: There's so much we can learn from that. So I just love how this is being done. And I just want to applaud the efforts, because if we really had this really rich data,

138
00:24:15.270 --> 00:24:31.980
Debra Ruh: There's so much we all can learn from it. And I was telling somebody the other day, it's like creating these really cool assistive technologies, like the exoskeleton systems, which are amazing, wonderful. But if somebody that needs it really badly can't get access to

139
00:24:31.980 --> 00:24:41.970
Debra Ruh: It, it almost breaks their heart a little bit, you know. So I think as a society we need this data so badly. So I want to applaud the efforts you're making, Esther.

140
00:24:42.870 --> 00:24:48.450
Esther Dakin-Poole: Thank you, that's very kind, but really it's thanks to the WHO for devising the tools.

141
00:24:49.530 --> 00:24:56.760
Esther Dakin-Poole: For the rest of us, no, I don't think anybody in our industry, across the vast divide of different disciplines,

142
00:24:58.620 --> 00:25:05.250
Esther Dakin-Poole: Would say that we don't need the data or that nothing needs fixing. You know, we all get up in the morning to do good.

143
00:25:05.700 --> 00:25:12.540
Esther Dakin-Poole: We're all good people. And that's really quite special to be part of an industry and an ecosystem that

144
00:25:13.050 --> 00:25:20.910
Esther Dakin-Poole: You can go to a conference and pretty much everybody's a happy smiley person that really wants to do good every morning and and so

145
00:25:21.390 --> 00:25:33.810
Esther Dakin-Poole: You're kind of on the right foot before we've even started because we all want to fix problems because we're all aware of them. But the system is so massive that we know that we need to come together to talk about them.

146
00:25:34.860 --> 00:25:39.510
Esther Dakin-Poole: But we're not starting from nothing. You know, we, there is, as you said, a

147
00:25:40.200 --> 00:25:50.850
Esther Dakin-Poole: large tranche of aid that's always been out there trying to help those countries who are less fortunate, or who find themselves in a less fortunate position at the moment.

148
00:25:51.780 --> 00:25:55.890
Esther Dakin-Poole: Their systems may have been stronger in the past, and now they need rebuilding.

149
00:25:56.850 --> 00:26:04.500
Esther Dakin-Poole: Our own system now is, you know, quite resilient; we're doing okay. But COVID is knocking a lot of us for six, and

150
00:26:05.010 --> 00:26:16.590
Esther Dakin-Poole: you know, I think that this data will help us rebuild afterwards and help with, as you said, helping those people that get out of bed every morning to do a good job, to help people

151
00:26:16.950 --> 00:26:22.410
Esther Dakin-Poole: like us when we get older, like my sister, your daughter. You know, these,

152
00:26:23.040 --> 00:26:43.680
Esther Dakin-Poole: these are the important things that we can fix. And it's not that they're kind of optional; they're things we really need. And if we can embolden that system, we can also improve jobs, we can create industry, and the industry itself can develop and produce so much self-sufficiency. It's incredible.

153
00:26:45.210 --> 00:26:54.660
Esther Dakin-Poole: And it's people-facing, people to people. So it's not just tech for tech's sake. And I think that's a really important thing that this data will actually embolden:

154
00:26:55.110 --> 00:27:09.480
Esther Dakin-Poole: human interaction, and show, hopefully, that the human touch, that human need to help one another and the human need to communicate with one another, is actually the core of all of this.

155
00:27:10.020 --> 00:27:19.800
Esther Dakin-Poole: And that by doing that we'll actually increase productivity for the individual and for the countries themselves, because we're reinvesting in ourselves. We're investing

156
00:27:20.520 --> 00:27:31.080
Esther Dakin-Poole: in our individuals to help them live a more full and productive life, to get out there again and not, as we are with COVID, be stuck at home, not being able to do anything or interact or spend any money.

157
00:27:32.160 --> 00:27:32.370
Debra Ruh: Right.

158
00:27:33.030 --> 00:27:35.760
Debra Ruh: A hundred percent agree on that. Sorry.

159
00:27:35.820 --> 00:27:43.200
Neil Milliken: Yeah. So one of my big things is around the fact that

160
00:27:44.580 --> 00:27:56.580
Neil Milliken: accessibility and assistive tech enable people to be economically active, and that is important for our global economy. So, as you said, it's an investment, but it's an investment with a return.

161
00:27:57.300 --> 00:28:06.210
Neil Milliken: And that's the difference. It's not like you're throwing good money after bad here; this is money that is going to support the economies, it's going to support

162
00:28:07.350 --> 00:28:19.050
Neil Milliken: countries to be able to rebuild. If we look at all of that age-related disability stuff that you were talking about, we have a super-aged population.

163
00:28:19.800 --> 00:28:24.690
Neil Milliken: For the next several decades, our global economy is going to be getting

164
00:28:25.650 --> 00:28:32.700
Neil Milliken: progressively older. Eventually that will change, because the demographics will hit a point

165
00:28:33.090 --> 00:28:44.460
Neil Milliken: where the generations die out and the younger generations will start to expand proportionately again. But right now we're in a situation where, if we don't

166
00:28:45.300 --> 00:29:03.720
Neil Milliken: build the capabilities and capacity to enable people to support themselves, we're going to be in a very difficult situation as a society. So the work that you're doing is tremendously important. And then on the other point that you were talking about, which is,

167
00:29:04.920 --> 00:29:06.960
Neil Milliken: It's kind of ironic because I am

168
00:29:08.430 --> 00:29:23.040
Neil Milliken: a horrible person for detail, right? I'm ADHD and dyslexic; detail bores me senseless. I hate spreadsheets. But I absolutely recognize the real importance of

169
00:29:23.430 --> 00:29:34.560
Neil Milliken: the classification piece, because again, with classifications you can start sorting the data, you can start doing stuff with other technologies.

170
00:29:35.070 --> 00:29:48.180
Neil Milliken: As Antonio was saying about people developing new technologies, well, they're going to fit into one of those classifications. And so then, if you've got these ISO classifications, there are going to be

171
00:29:48.720 --> 00:30:01.920
Neil Milliken: systems and ISO standards that understand how you integrate these. Because while some tech, like crutches and glasses, is standalone, they're independent, they're not high-tech,

172
00:30:02.820 --> 00:30:11.670
Neil Milliken: lots of the stuff that we'll be putting on the list has interdependencies with other technologies. So if we have classifications, then we can make sure

173
00:30:12.120 --> 00:30:29.520
Neil Milliken: that those classifications are understood when we're designing mainstream technology, so that we can support these other things. So I totally get the work that you're doing. I love the fact that someone else is going to do all the detail stuff for me, because we,

174
00:30:30.720 --> 00:30:31.200
Neil Milliken: Thank you.

175
00:30:33.690 --> 00:30:35.760
Esther Dakin-Poole: Your, your point about the

176
00:30:36.960 --> 00:30:47.100
Esther Dakin-Poole: interrelationship between the items is key. And actually, I think if you think of the list as being the seed, it's the very beginning. And although

177
00:30:48.090 --> 00:30:55.890
Esther Dakin-Poole: it is, very frustratingly, product-focused for many of us; actually we've been campaigning for years that it's meant to be person-centric, not product-centric.

178
00:30:57.300 --> 00:31:09.930
Esther Dakin-Poole: Thus, having had discussions with people, particularly in the industry around social care where, again, they've been fighting for years for it not to be product-centered but to be person-centered,

179
00:31:11.220 --> 00:31:20.820
Esther Dakin-Poole: they've recognized, as I have, having understood the whole spectrum of it, that this is about identifying, for example,

180
00:31:21.720 --> 00:31:33.570
Esther Dakin-Poole: a wheelchair and saying, fine, we acknowledge that these are really useful products. Or crutches, for example: at some point, you know, a lot of people use these; we fall over, we break our legs.

181
00:31:34.950 --> 00:31:42.990
Esther Dakin-Poole: We'd be in a lot of trouble without crutches or wheelchairs, for example. They're nice, simple, easy tick boxes once we know that they're good things.

182
00:31:43.590 --> 00:31:55.650
Esther Dakin-Poole: But actually so many other people are connected to that, not just the manufacturers of the end product: you have the developers, you have the people who are checking on the standards of it.

183
00:31:56.670 --> 00:31:59.220
Esther Dakin-Poole: You also have the physiotherapists who are using it.

184
00:31:59.250 --> 00:32:01.740
Esther Dakin-Poole: You may have the occupational therapists using it.

185
00:32:02.550 --> 00:32:11.940
Esther Dakin-Poole: The environmental standards, whereby if you don't have the right ramps and an accessible home, you can't use them, or if your streets aren't paved correctly. So one

186
00:32:12.270 --> 00:32:24.450
Esther Dakin-Poole: Tiny little tick box to say that this is an essential item starts a proliferation of conversations. And if you can imagine that conversation is started around 100 items.

187
00:32:25.200 --> 00:32:41.340
Esther Dakin-Poole: I think that's why we've tried to keep it to a smaller number, because actually your conversations then intersperse, so each one, as you said from a digital perspective, then starbursts out to a whole sector of other people and industries and professionalized sectors.

188
00:32:42.420 --> 00:32:51.630
Esther Dakin-Poole: It's the start of a conversation. So by bringing all of these people in a stakeholder group together as an ecosystem, we can have one heck of a conversation.

189
00:32:53.340 --> 00:33:02.610
Esther Dakin-Poole: And the idea is that all of this data from these initial conversations for the first APL would then go into a national AT report,

190
00:33:03.780 --> 00:33:13.680
Esther Dakin-Poole: written by the stakeholders and contributed to by the stakeholders with their data, with their case studies, to say what's great and what's not great.

191
00:33:14.160 --> 00:33:21.120
Esther Dakin-Poole: And then we can also make suggestions. And the idea is then that that is repeated with the APL every couple of years,

192
00:33:21.720 --> 00:33:31.620
Esther Dakin-Poole: so that as those discussions change, they become very enriched with research, and hopefully we have the buy-in from government as well.

193
00:33:32.100 --> 00:33:41.400
Esther Dakin-Poole: But this then becomes something that, as other countries have done, is carried out by every country, hopefully, around the world.

194
00:33:42.300 --> 00:33:50.370
Esther Dakin-Poole: And then if we can come together again to revisit the APL in a global report, a second one,

195
00:33:50.880 --> 00:34:00.600
Esther Dakin-Poole: We're going to really ramp up that data. We're going to really know how important it is in our lives. And I think where we'll be in 10 years time.

196
00:34:01.110 --> 00:34:13.410
Esther Dakin-Poole: by the end of the, you know, the SDGs in 2030. I think AT will suddenly become a very big-ticket item in the next assessment at the end of that time, and

197
00:34:14.550 --> 00:34:28.410
Esther Dakin-Poole: I think the SDGs have no mention of assistive technology, yet it's part of so many of them; you can touch upon 8, 10, 4, 11, I can't remember off the top of my head, but

198
00:34:29.580 --> 00:34:40.740
Esther Dakin-Poole: it's there in access, you know, in equity, inequalities and education. It's very much in good health. But it's not just in one pocket; it's all over,

199
00:34:41.280 --> 00:34:55.680
Esther Dakin-Poole: because it's spread. AT is everywhere. And I think if we can use this data, we can then show how ubiquitous it is and how essential it is through all of those different sectors and all the SDGs.

200
00:34:56.940 --> 00:35:00.570
Esther Dakin-Poole: But we can't do without the data. So that's why we need the help to collect it.

201
00:35:01.710 --> 00:35:07.950
Antonio Santos: That's something that is pretty much at the heart of AXSChat, and why we started this conversation: it's about,

202
00:35:08.400 --> 00:35:16.320
Antonio Santos: you know, talking about these topics with everyone. So you were mentioning before that people who work in this sector are happy faces

203
00:35:16.710 --> 00:35:31.110
Antonio Santos: who wake up in the morning wanting to do good. But how do we engage with others who are not in assistive tech? How can we get them to become happy faces? I'm talking about the procurement professionals,

204
00:35:32.130 --> 00:35:37.320
Antonio Santos: people who work in cities, designing and planning. How can we make them happy faces?

205
00:35:38.700 --> 00:35:47.790
Esther Dakin-Poole: That's a lovely idea. I have a picture of happy faces everywhere. But I think part of the message is that AT is for all of us.

206
00:35:49.200 --> 00:35:56.880
Esther Dakin-Poole: I normally wear glasses as well, but I'm kind of tired of wearing glasses after six months of staring at the screen, so you're a bit of a blur. But

207
00:35:58.230 --> 00:36:04.800
Esther Dakin-Poole: glasses are AT, and, you know, I normally use them. Once you see that AT is actually for everyone,

208
00:36:06.000 --> 00:36:10.500
Esther Dakin-Poole: then people see what the benefit is to themselves as individuals.

209
00:36:12.060 --> 00:36:27.360
Esther Dakin-Poole: If you need glasses and suddenly your glasses are taken away, or you fall down and there's nobody to fix you, there's no cast for your leg, there are no crutches for you, there's no grab bar for your granny or your parents,

210
00:36:28.500 --> 00:36:36.720
Esther Dakin-Poole: all the essential items that you need; there's no emergency cord for your mother that lives in a care home. If these devices suddenly aren't there,

211
00:36:37.350 --> 00:36:48.180
Esther Dakin-Poole: then people start to realize how useful they are and how essential they are. And I think part of the message of these broader programs is to normalize AT,

212
00:36:48.750 --> 00:36:59.910
Esther Dakin-Poole: just to see that this is something that we all use. It's not just used by an odd few whom we label as disabled. It's the same as the whole disability conversation.

213
00:37:00.750 --> 00:37:09.750
Esther Dakin-Poole: It's not just for the labeled, because if you're labeled, then you get access to the tech. And hopefully, by doing all of these projects,

214
00:37:10.410 --> 00:37:18.780
Esther Dakin-Poole: none of us will be labeled with anything. AT is just something that is used by most of us in some form or other,

215
00:37:19.770 --> 00:37:30.990
Esther Dakin-Poole: and therefore it's not a special item anymore; it's just a day-to-day item. And so they should be happy, because this is just something that we can't live without. We didn't know what IT was

216
00:37:32.100 --> 00:37:42.120
Esther Dakin-Poole: because it was a term that was new, and it was novel because we didn't really have that many computers. Now everybody knows what it is. AT is the new IT.

217
00:37:43.260 --> 00:37:53.010
Esther Dakin-Poole: It's something that will be with us forever, and it should be, and it should be normalized. In the same way that, once we've

218
00:37:53.640 --> 00:38:02.970
Esther Dakin-Poole: got rid of the idea that some people who can't do as much as other people have to have a label put on them, the term disability hopefully won't be there either.

219
00:38:03.450 --> 00:38:16.770
Esther Dakin-Poole: As AT changes, so will the spectrum of how we term the use of these things, and everyone will procure them because it's just logical. It's not unusual anymore; it's normative. Yes.

220
00:38:16.830 --> 00:38:18.480
Neil Milliken: That's great. And

221
00:38:19.980 --> 00:38:26.430
Neil Milliken: Some of the technologies that I use are mainstream technologies. And one of the things I've been saying for

222
00:38:27.450 --> 00:38:29.100
Neil Milliken: decades now, I now feel,

223
00:38:30.210 --> 00:38:45.330
Neil Milliken: is the fact that actually all good technology is designed to be assistive: it's there to help you achieve something that was either more difficult or impossible. And so all of this is on a spectrum of

224
00:38:46.800 --> 00:38:52.650
Neil Milliken: difficulty of the problem to solve. So yeah, I'm absolutely with you. It should be just,

225
00:38:53.430 --> 00:39:02.400
Neil Milliken: you know, part of normal life. So we've reached the end of our time, unfortunately. It's been fascinating talking with you, and I love the work that you're doing. I need to thank

226
00:39:03.090 --> 00:39:18.000
Neil Milliken: Barclays Access, MyClearText and Microlink for helping keep us on air, keeping us fed and watered, and generally showing love to the community. Thank you once again, Esther; it's been a real pleasure, and we look forward to you joining us on Twitter on Tuesday.

227
00:39:18.720 --> 00:39:21.330
Esther Dakin-Poole: Thank you very much indeed. Bless you. Bye bye.