EDGE AI POD

Revolutionizing Education and Healthcare with Cutting-Edge Human-AI Interaction

EDGE AI FOUNDATION

Discover how the fusion of cutting-edge technology and human interaction is revolutionizing education and healthcare with Professor Stella Kuei-An Nguyen from National Yang Ming Chiao Tung University. Join us as we explore her innovative work in Human-AI Interaction using Inertial Measurement Units (IMUs), and follow the evolution of the Raboni IMU device into the transformative Bony AI. Collaborating with partners like TSMC, MIT, and leading hospitals, Professor Nguyen shows how these advances are not only elevating coding education but also offering groundbreaking support for medical conditions such as Parkinson's and dementia. These innovations are powered by tiny machine learning algorithms, blending sensors and an AI processor into user-friendly devices that redefine efficiency in educational and healthcare diagnostics.

Dive into the world of advanced motion-monitoring technology as we explore the integration of the Raboni IMU and the WiseEye2 AI processor, designed to intelligently track movement. This system promises to change how we assess physical activity, especially for children at play. With a specially designed app that extracts and analyzes movement curves, the technology offers insights that could enhance both educational environments and the healthcare sector. Join us for an insightful journey into how these tools aim to enrich lives by providing effective and efficient information.

Learn more about the EDGE AI FOUNDATION - edgeaifoundation.org

Speaker 1:

I'm welcoming our next guest, Professor Stella Kuei-An Nguyen from National Yang Ming Chiao Tung University. Thank you for joining us; I look forward to the talk. Thank you. Thank you, Chair, and ladies and gentlemen, it's my great pleasure to be here to share this topic with you. I will talk about HAI with the WE2, its integration with the IMU (Inertial Measurement Unit), and the use cases in education and healthcare. You can see that we have a big team: seven students of mine, who are actually doing integrated circuit design; Dr Liu from the Veterans General Hospital; and Dr Zhang, Dr Zhen and Dr Yan from Chang Gung Memorial Hospital. I would also like to offer special thanks to Himax for their long-term technical support of this big team.

Speaker 1:

OK, here's the abstract. Actually, I have three things to share. Firstly, I want to share the beauty of integrating the IMU, the Raboni IMU device. Raboni is the biblical word for a master or teacher; the name was given by our former university president, Zhang Maozhong, and the device is equipped with an IMU sensor manufactured by TSMC. We also integrated the AI processor from Himax, and with that integration we have gone from Raboni to Bony AI. So I have it here: this is Raboni, the BLE device. It's a little bigger than an iWatch, but it's good enough, and cute enough, for kids' education and also for healthcare. With the WiseEye2 and the IMU integrated, the size is now dominated by the Bluetooth module and the SD memory module, so you can see this is good for education and healthcare.

Speaker 1:

Sorry, I have to go back. Okay, I don't think it's so friendly to me. The second thing I will share with you is the HAI devices working for education, which is actually a collaboration with the TSMC SDG program and with the MIT Scratch Lab. The third thing I will share is the collaboration with medical doctors. We call this a physical tiny machine learning algorithm because, you know, our students are required to learn DSP, and I think combining a machine learning algorithm with DSP can do a really good job for any kind of recognition. The doctors simply hope for long-term detection of degeneration from diseases like Parkinson's, spinal disease, seizures, hunchback and dementia.

Speaker 1:

I do appreciate the integration of the IMU and the AI chip, because formerly we had a bunch of sensor data coming from the IMU that we had to transmit over Bluetooth to a mobile device, which is very power-consuming. Now we still have all of the sensing data, but all of the recognition and processing can be done on the HAI device, which we call Bony AI for short (from Raboni to Bony AI), and only the outcome needs to be transmitted. That may be just a couple of bytes, so it's very, very efficient.

For education, I have only one slide, because we have hundreds of user cases on video. You can see that with the HAI device, with Raboni and with Bony AI, we can teach not only coding but also environmental protection. We now use it to teach kindergarten through grade 12; that is, we teach even kindergarten students about semiconductors and AI, which is required by TSMC's SDG program. Actually, I have to mention that tomorrow we will have a late-evening meeting with MIT to talk about publishing a book. It is sponsored by TSMC; they would like to promote this kind of AIoT usage combined with education. The professors from our Institute of Education told me that this kind of device is excellent for promoting active learning and interactive learning, and I think that may be a kind of usage worth sharing with you.
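To make the earlier point about transmitting only the recognition result concrete, here is a rough back-of-the-envelope sketch in Python. All the numbers (sampling rate, sample width, window length, result size) are illustrative assumptions, not the actual Raboni/Bony AI configuration.

```python
# Hypothetical sketch of the "transmit results, not raw data" idea described above.
# Numbers (100 Hz, 6-axis, 16-bit samples, 2 s windows) are illustrative assumptions.

SAMPLE_RATE_HZ = 100        # assumed IMU sampling rate
AXES = 6                    # 3-axis accelerometer + 3-axis gyroscope
BYTES_PER_SAMPLE = 2        # assumed 16-bit samples
WINDOW_SECONDS = 2          # assumed classification window

def raw_streaming_bytes_per_window() -> int:
    """Bytes sent over Bluetooth if every raw sample is streamed to the phone."""
    return SAMPLE_RATE_HZ * WINDOW_SECONDS * AXES * BYTES_PER_SAMPLE

def on_device_inference_bytes_per_window() -> int:
    """Bytes sent if the AI processor classifies locally and sends only a label."""
    return 2  # e.g. 1-byte class ID + 1-byte confidence

if __name__ == "__main__":
    raw = raw_streaming_bytes_per_window()
    edge = on_device_inference_bytes_per_window()
    print(f"raw streaming: {raw} bytes/window, on-device: {edge} bytes/window")
    print(f"reduction: ~{raw // edge}x less Bluetooth traffic")
```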

Speaker 1:

And secondly, we use Bony AI to work with medical doctors and to check all the conditions I have just mentioned: Parkinson's, spinal disease, seizures, hunchback, sarcopenia and dementia. I think I'm still a stranger to these medical vocabularies. What we are doing is detecting walking, jumping, arm movement, hand movement and spinning movement. Even now, most doctors have to observe these diseases or disorders by eye inspection. So the collaborative work with the doctors goes like this: first, the doctors tell us what kind of movement they want to observe; then we provide the sensor and our students collect the data; after that we do tiny machine learning detection for classification (I will talk about that later), with a DSP algorithm to assist the accuracy of the judgment; and then we use ViCon, the high-speed camera system, to evaluate the algorithms. We also have to outsource the app design, because our students are not good at it, and the key there is the user interface and user experience, since the app is taken into hospitals for neurologists and patients to use. After the data is analyzed, what the disorder is, its extent, the diagnosis and the treatment are all decided by the doctors. So the top and the end of the chain are the doctors' work, and our team can only do the intermediate work.
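A minimal sketch of that collaboration pipeline might look like the following Python skeleton. The function bodies are placeholders under assumed interfaces; the real data formats, the trained model, and the clinician-facing app are not described in the talk.

```python
# Hedged sketch of the doctor-to-engineer pipeline described above; all interfaces
# are assumptions for illustration, not the actual Bony AI software.
from dataclasses import dataclass
import numpy as np

@dataclass
class MovementSpec:
    name: str              # movement the doctor asks to observe, e.g. "gait"
    duration_s: float      # how long to record

def collect_imu(spec: MovementSpec, rate_hz: int = 100) -> np.ndarray:
    """Placeholder for IMU recording: returns an (N, 6) accel+gyro array."""
    n = int(spec.duration_s * rate_hz)
    return np.zeros((n, 6))  # stand-in for real sensor data

def dsp_features(window: np.ndarray) -> np.ndarray:
    """DSP stage: hand-crafted features that assist the classifier."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def classify(features: np.ndarray) -> int:
    """tinyML stage: a trained model would run here (on-device in Bony AI)."""
    return 0  # stand-in class label

def evaluate_against_vicon(predicted: int, vicon_label: int) -> bool:
    """Ground-truth check against the ViCon motion-capture system."""
    return predicted == vicon_label
```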

Speaker 1:

In the following, I will show all the examples we have done. Firstly, there is Tianlong Babu; I think it is a kung fu drama that everyone in the Chinese community knows, with eight fancy kung fu styles used to beat the bad guys. We call it that because the director of the neurosurgery department at Chang Gung told us that there are four types of walk and four types of jump from which he can recognize disorders or degeneration of the spine. So he recorded the correct movements, we collected the data, and then we had to recognize things like step length, step height and frequency, so the doctors could do the analysis and give a score to the patient. This is the real app we developed, and we used the HAI devices with patients, with IRB approval for sure. So you can see we collect the data (this is what the HAI devices do) and the doctors analyze the data and give them a score. I think the doctors like this, because formerly they had to observe the phenomenon by eye, maybe comparing before and after an operation, or what the patient could do three months before versus three months after. This can support precision medicine in their diagnosis.
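As an illustration of how a gait parameter such as step frequency can be pulled out of IMU data with simple DSP, here is a hedged Python sketch using peak detection on vertical acceleration. The actual feature extraction in Bony AI is not described in detail, so the filter settings and thresholds below are assumptions.

```python
# Illustrative only: estimate step frequency from vertical acceleration.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def step_frequency_hz(vert_accel: np.ndarray, fs: float = 100.0) -> float:
    """Estimate steps per second from vertical acceleration (m/s^2)."""
    # Low-pass filter to keep the gait band (< 5 Hz) and suppress sensor noise.
    b, a = butter(4, 5.0 / (fs / 2), btype="low")
    smooth = filtfilt(b, a, vert_accel - vert_accel.mean())
    # Each heel strike shows up as a peak; enforce a minimum step interval of 0.3 s.
    peaks, _ = find_peaks(smooth, distance=int(0.3 * fs), height=0.5)
    duration_s = len(vert_accel) / fs
    return len(peaks) / duration_s if duration_s > 0 else 0.0

# Example with synthetic data: a 2 Hz "stepping" signal plus noise.
if __name__ == "__main__":
    fs = 100.0
    t = np.arange(0, 10, 1 / fs)
    accel = 1.5 * np.sin(2 * np.pi * 2.0 * t) + 0.2 * np.random.randn(t.size)
    print(f"estimated step frequency: {step_frequency_hz(accel, fs):.2f} Hz")
```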

Speaker 1:

Frankly speaking, after a 30-year chip design career, I don't think I could make a greater contribution to humanity than through this work. I have only five minutes, so I will quickly browse the following slides, and you can see what the tiny machine learning does. We collect the data for the four types of jumps and do the model analysis. At first we have 72 parameters, but because my students have to do the chip design, they reduce the number of features to make the chip design feasible, and the accuracy still comes up to 98%. The same goes for the walk: here the parameters are reduced and the accuracy drops, but with DSP support the accuracy can be improved, especially for Parkinson's disease.
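A hedged sketch of that feature-reduction idea, on synthetic data, could look like this: start from a 72-dimensional feature vector, keep only the most discriminative features, and check that a small classifier still performs well. The dataset, the selector (SelectKBest), and the classifier choice are all assumptions for illustration, not the team's actual method.

```python
# Synthetic stand-in for the jump/walk dataset: 72 features, 4 movement classes.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=400, n_features=72, n_informative=12,
                           n_classes=4, random_state=0)

# Keep only the 12 features with the strongest class separation,
# mimicking the reduction that makes the on-chip model feasible.
selector = SelectKBest(f_classif, k=12)
X_small = selector.fit_transform(X, y)

clf = KNeighborsClassifier(n_neighbors=5)
print("accuracy, all 72 features:", cross_val_score(clf, X, y, cv=5).mean())
print("accuracy, 12 selected    :", cross_val_score(clf, X_small, y, cv=5).mean())
```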

Speaker 1:

Okay, I really wanted to show you this video. You can see this lady, who has Parkinson's disease, and this is a doctor, and the recordings are from consecutive days: the doctor stopped her medicine and also turned off the deep brain stimulator in her brain, and you can see her walking and movement style. That's why we use AIoT HAI devices for patients. I really think this contributes to human beings: with just one day's difference, it's a totally different kind of walking movement. So I think we can provide precise data to the doctors, and the doctors can give precise treatment to the patient. We have 18 PD patients, and, just as I said before, the same goes for hand movement and the other movements.

Speaker 1:

Now we change the features: with DSP support, the features are no longer the raw accelerometer or gyroscope readings but things like step length and step height. And you can see we can use KNN to detect the PD patients, with the normal controls confirmed by the doctors. The same goes for hand gestures and for arm movement. For seizures, we also use Bony AI to compute the correlation with EEG data, so now patients don't have to be hospitalized; they can just detect seizures at home. That work is with Chang Gung University. And for hunchback: we may all have had the experience of going to a big hospital to take an X-ray, which involves a massive amount of work, and now doctors can capture the curve of the back just with the AIoT, just with the HAI devices, so the patient doesn't have to go through all of that just to take an X-ray. The doctors like that too, and we have a bonus video.
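Since KNN on DSP-derived gait features is mentioned explicitly, here is a tiny Python sketch of that step. The step-length and step-height values below are invented placeholders, not clinical data, and the real model would be trained on the doctor-labeled recordings described above.

```python
# Minimal KNN sketch: classify PD patients vs. normal controls from two
# DSP-derived gait features. All numbers are invented for illustration.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Each row: [step_length_m, step_height_m]; labels: 1 = PD patient, 0 = control.
X_train = np.array([
    [0.35, 0.04], [0.38, 0.05], [0.40, 0.05],   # shorter, flatter steps (PD-like)
    [0.62, 0.11], [0.65, 0.12], [0.60, 0.10],   # longer, higher steps (control-like)
])
y_train = np.array([1, 1, 1, 0, 0, 0])

knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

new_subject = np.array([[0.41, 0.05]])   # features extracted by the DSP stage
print("predicted label (1 = PD):", int(knn.predict(new_subject)[0]))
```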

Speaker 1:

I think I still have some time, thank you. The bonus video is of a kid playing with our device, and we can extract the curve of the kid's movement with the app we designed. Okay, how do I get back? Okay, finally we come to the conclusion. I will say that with the integration of Raboni, the IMU and the WiseEye2 AI processor, we can smartly monitor all kinds of movements, and we can provide the most efficient and effective information, which will bring a better life. I think the applications are not only in education and healthcare; we can do anything involving movement with the IMU and the AI processor integrated. So it has been a good experience for me, and I hope to share it with all of you.