RESEARCH
Research Grants: Ongoing
(Ongoing) A Study on Affordance-Based Analysis of User Experience Factors for Multisensory UIs in XR Environments
Ministry of Education, National Research Foundation of Korea (NRF), Early-Career Researcher Program (Humanities and Social Sciences), 2024.06 ~ 2027.05, PI: Prof. Miran Lee
(Ongoing) Development of a Life-History Program for Older Adults, Verification of Its Efficacy, and Development of Convergent Core Technology for an AI-Based Life-History Program
Ministry of Education, National Research Foundation of Korea (NRF), Global Humanities and Social Sciences Convergence Research (Research Group Type, Domestic), 2024.06 ~ 2027.05
Homepage: https://ailspchallenge.modoo.at/
(Ongoing) Development of a Patient Robot with Musculoskeletal Symptoms and a VR Training Program for Effective Rehabilitation Education
Ministry of Science and ICT, National Research Foundation of Korea (NRF), First-Time Researcher Program (RS-2023-00213191), 2023.03 ~ 2026.02, PI: Prof. Miran Lee
(Ongoing) Development of a User Emotion Class Decoding Algorithm to Improve Interaction with XR Interfaces
Daegu University Research Grant 2024 (2024.05 ~ 2025.04), PI: Prof. Miran Lee
The main purposes of the method proposed in this research are (a) to provide feedback that allows the caregiver to react immediately to the emotions expressed by the robot during care training, and (b) to propose a way for the caregiver and the robot to communicate their emotions, thereby enhancing the caregiver's ancillary qualities such as stability, optimism, and communication.
Project Investigator: Dr. Wei Qun (Clairaudience Company Limited) & Department of Biomedical Engineering, School of Medicine, Keimyung University
Multimodal Technology-based Smart Stethoscope (2022~Present)
[Paper] / [BibTex]
The proposed device is designed in the shape of a compact personal computer mouse for easy grasping and attachment to the surface of the chest using only one hand.
A digital microphone and a photoplethysmography (PPG) sensor are installed on the bottom and top surfaces of the device, respectively, to measure the heart sound and pulse from the user's chest and finger simultaneously.
A high-performance Bluetooth Low Energy (BLE) system-on-chip with an ARM microprocessor is used to pre-process the measured data and communicate with the smartphone.
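As an illustration of the kind of on-device pre-processing described above, the sketch below detrends a PPG channel with a moving average and estimates the pulse rate from peak-to-peak intervals. The sampling rate, window sizes, and thresholds are illustrative assumptions, not the device's actual firmware parameters.

```python
import math

FS = 100  # assumed PPG sampling rate in Hz (illustrative)

def moving_average(signal, window):
    """Simple moving-average baseline estimate."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def detect_peaks(signal, min_distance):
    """Local maxima above zero, at least min_distance samples apart."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > 0 and signal[i] >= signal[i-1] and signal[i] > signal[i+1]:
            if not peaks or i - peaks[-1] >= min_distance:
                peaks.append(i)
    return peaks

def pulse_rate_bpm(ppg, fs=FS):
    """Detrend, find pulse peaks, convert mean interval to bpm."""
    baseline = moving_average(ppg, window=fs)            # ~1 s window
    detrended = [x - b for x, b in zip(ppg, baseline)]
    peaks = detect_peaks(detrended, min_distance=fs // 3)  # cap ~180 bpm
    if len(peaks) < 2:
        return None
    intervals = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Synthetic 72-bpm pulse wave riding on a slow baseline drift
ppg = [math.sin(2 * math.pi * 1.2 * t / FS) + 0.5 * t / (10 * FS)
       for t in range(10 * FS)]
print(round(pulse_rate_bpm(ppg)))  # ≈ 72
```

In a real device the same steps would run on the BLE SoC before transmission, so only compact features (e.g., the pulse rate) need to be sent to the smartphone.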
A human patient simulator (HPS) can provide effective visual, auditory, text, and alarm-based feedback in care and nursing education. Among these, visual feedback is essential for designing an HPS that expresses emotions or pain as an actual human does, because it allows an immediate reaction between the robot and the human. This study aims to develop a visual feedback method for a care training assistant robot that can express pain states in joint care education.
Development and Quantitative Assessment of an Elbow Joint Robot for Elderly Care Training (2018~2021)
[Project Page] / [Paper] / [BibTex]
Develop the upper limb of the care training robot
Develop the care training program for measuring quantitative data and evaluating the care training skills
Quantitative assessment of the care skills
Development of the Shoulder Complex for Care Training (2018~2021)
Develop the upper limb of the care training robot
Pain Expression-based Visual Feedback Method for Care Training Assistant Robot with Musculoskeletal Symptoms (2018~2021)
[Project Page] / [Paper] / [BibTex]
3D facial pain expression that simulates a patient with musculoskeletal symptoms, for improving the skills of workers in care education
Prediction of the robot's pain intensity based on a fuzzy-logic method
Measurement of pain intensity from a facial image using a Siamese network to generate the robot's avatar
Integration of pain expression-based avatar and patient robot
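A minimal sketch of the fuzzy-logic idea above: a joint angle is fuzzified with triangular membership functions and defuzzified into a pain score by a weighted average. The angle ranges, set labels, and rule consequents are all illustrative assumptions, not the project's actual inference model.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def pain_intensity(angle_deg):
    """Fuzzify the angle, apply the rules, defuzzify by weighted average."""
    # Membership degrees of the input angle (degrees past the comfort range)
    safe    = tri(angle_deg, -40, 0, 40)
    stretch = tri(angle_deg, 20, 50, 80)
    overext = tri(angle_deg, 60, 100, 140)
    # Rule consequents: representative pain scores on a 0-10 scale
    rules = [(safe, 0.0), (stretch, 4.0), (overext, 9.0)]
    num = sum(mu * score for mu, score in rules)
    den = sum(mu for mu, _ in rules)
    return num / den if den else 0.0

print(pain_intensity(10))   # fully inside "safe": 0.0
print(pain_intensity(70))   # blend of "stretch" and "overextended": ≈ 6.1
```

The weighted-average (Sugeno-style) defuzzification keeps the output continuous as the joint moves between sets, which is what lets an avatar's pain expression change smoothly rather than in steps.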
R-Peak Detection Method for Mobile Environments (2016~2018)
Fiducial Points (P, QRS complex, T wave) Detection
ECG Signal analysis in mobile environments
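The detection pipeline can be illustrated with a simplified Pan-Tompkins-style sketch: derivative, squaring, moving-window integration, then thresholding with a refractory period. The sampling rate, window lengths, and threshold here are illustrative; the actual mobile-optimized algorithm is not reproduced.

```python
FS = 250  # assumed ECG sampling rate (Hz)

def detect_r_peaks(ecg, fs=FS):
    # 1) Derivative emphasizes the steep QRS slopes
    deriv = [ecg[i + 1] - ecg[i - 1] for i in range(1, len(ecg) - 1)]
    # 2) Squaring makes values positive and accentuates large slopes
    squared = [d * d for d in deriv]
    # 3) Moving-window integration (~150 ms) merges each QRS into one bump
    win = int(0.15 * fs)
    integ, acc = [], 0.0
    for i, v in enumerate(squared):
        acc += v
        if i >= win:
            acc -= squared[i - win]
        integ.append(acc / win)
    # 4) Threshold at half the maximum; one detection per refractory period
    thr = 0.5 * max(integ)
    peaks, last = [], -fs
    for i, v in enumerate(integ):
        if v > thr and i - last > int(0.3 * fs):  # 300 ms refractory
            peaks.append(i)
            last = i
    return peaks

# Synthetic ECG: one narrow spike per second on a flat baseline
ecg = [1.0 if t % FS == FS // 2 else 0.0 for t in range(5 * FS)]
print(len(detect_r_peaks(ecg)))  # → 5 (one detection per beat)
```

The refractory period is what makes the method robust on noisy mobile recordings: once a beat is detected, nearby threshold crossings within 300 ms are ignored instead of being counted as extra beats.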
Arrhythmia Detection (2018~Present)
Stress Management (2016~2018)
Health-care wearable sensor with behavior analysis (physical activity and stress state recognition)
Develop the stress measurement algorithm
Analysis of HRV (heart rate variability) based on ECG signal using ECG wearable sensor
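As a hedged illustration of the HRV analysis mentioned above, the sketch below computes two standard time-domain measures, SDNN and RMSSD, from a made-up list of R-R intervals in milliseconds; it is not the project's stress-recognition algorithm itself.

```python
import math

def sdnn(rr_ms):
    """Standard deviation of all R-R (NN) intervals."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive R-R interval differences."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 790, 805, 830, 795, 810]  # illustrative R-R series (ms)
print(round(sdnn(rr), 1), round(rmssd(rr), 1))  # → 12.9 23.6
```

Lower values of these measures are commonly associated with higher sympathetic activation, which is why they serve as features for stress-state recognition from wearable ECG.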
Activity Recognition and Energy-Expenditure Estimation Using Wearable Sensors (2016~2018)
Health-care wearable sensor with behavior analysis (physical activity and stress state recognition)
EOG-based Eye Tracking Protocol (2014~2016)
A method to remove baseline drift and noise using a differential electrooculography signal based on a fixation curve (DOSbFC), together with a new eyeglasses-based electrode positioning scheme for user convenience
Mobile application controlling a human-computer interface using an EOG signal
Desktop application controlling a human-computer interface using an EOG signal
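The DOSbFC method itself is not reproduced here; as a generic illustration of the baseline-drift problem it addresses, the sketch below removes a slow linear drift from a signal by subtracting its least-squares straight-line fit. All names and values are made up for the example.

```python
def remove_linear_drift(signal):
    """Subtract the least-squares straight-line fit from the signal."""
    n = len(signal)
    xs = range(n)
    mean_x = (n - 1) / 2
    mean_y = sum(signal) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, signal))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return [y - (mean_y + slope * (x - mean_x)) for x, y in zip(xs, signal)]

# A pure linear drift should be removed almost exactly
line = [0.5 + 0.01 * t for t in range(100)]
residual = max(abs(v) for v in remove_linear_drift(line))
print(residual < 1e-9)  # True
```

A global line fit is the simplest possible detrending; the appeal of a fixation-curve-based differential signal, as in the project above, is that it can track drift that changes over time rather than assuming a single slope.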
Research Grants: Completed
(Completed) Development of VR Interface Content Reflecting Muscle Strength to Improve the Sense of Agency in Rehabilitation Exercise
Daegu Technopark (Daegu TP), 2024 Regional Innovation-Centered University Research Activity Support Program (Living Lab 2.0), 2024.05 ~ 2024.11, PI: Prof. Miran Lee
(Completed) Development of an On-Device AI Algorithm for a Ring-Type Biosignal Measurement Device for Maternal and Fetal Health Monitoring
Leaders in INdustry-university Cooperation 3.0 (LINC 3.0) Industry-Academia Joint Technology Development Project, 2024.06 ~ 2024.11, PI: Prof. Miran Lee
(Completed) HCI-AI Laboratory
Leaders in INdustry-university Cooperation 3.0 (LINC 3.0) Certified Corporate Support Laboratory Operation Support Program, 2024.05 ~ 2024.11, PI: Prof. Miran Lee
(Completed) Development of a Virtual Reality Rehabilitation Exercise Program to Promote Public Health in a Contact-Free Society (2023.02 ~ 2024.01)
Dongil Culture and Scholarship Foundation Research Grant, PI: Prof. Miran Lee
(Completed) A Study on Recognizing Users' Underlying Emotions through ResNet-Based Speech Recognition
Daegu University Next-Generation Scholars Research Program, 2023.04 ~ 2024.03, PI: Minjeong Lee, Advisor: Prof. Miran Lee
(Completed) Development of a Stethoscope Toy for Infants Enabling Daily Health Monitoring, in Response to the Pediatric Care Shortage
Daegu TP, Regional Innovation-Centered University Research Activity Support Program, 2023.06 ~ 2023.12, PI: Prof. Miran Lee & Sungmin Lee (Master's Candidate)
(Completed) Development of a Forward-Head (Turtle-Neck) Syndrome Monitoring Algorithm Applicable to Precision Lifting
Leaders in INdustry-university Cooperation 3.0 (LINC 3.0) Industry-Academia Joint Technology Development Project, 2023.06 ~ 2023.12, PI: Prof. Miran Lee
(Completed) Development of a Speech-Based Emotion Recognition Method for Avatar Assistants in XR Environments
WISET Women Graduate Students' Engineering Research Team Program (Advanced Course), 2023.04 ~ 2023.10
PI: Minjeong Lee, Advisor: Prof. Miran Lee
(Completed) Development of a Heart Sound Analysis Algorithm for a Smart Stethoscope in Daily Life (2022.09 ~ 2022.12), PI: Prof. Miran Lee
(Daegu-Gyeongbuk TP, 2022 Regional Company-Youth Hope Connection Support Program, Corporate Problem-Solving Project)
(Completed) A Decoding Algorithm for Recognizing Users' Inner Emotions in Virtual Reality
Gyeongbuk TP, Gyeongbuk SW Workforce Development Program, 2023.05 ~ 2023.10
PI: Prof. Miran Lee & Team Leader: Suyoung Kim
(Completed) A Study on a Patient-Mimicking Robot and a Robot Pain Inference Model for Care Skill Training (2022.06 ~ 2023.05)
Daegu University Research Grant
This research was supported by Daegu University Research Grant, 2022-0219
PI: Prof Miran Lee
(Completed) A Study on Heart Sound Analysis Techniques for a Smart Stethoscope in Mobile Environments (2022.09 ~ 2023.01), PI: Prof. Miran Lee
Leaders in INdustry-university Cooperation 3.0 (LINC 3.0) Industry-Academia Joint Technology Development, Ministry of Education & National Research Foundation of Korea (NRF)
(Completed) Adaptable AI-enabled robots to create a vibrant society (2021.10~2022.01)
JST Moonshot Research and Development Program-Moonshot Goal 3 (PM: Prof. Hirata and Advisor: Prof. Wen)
(Completed) Elucidation of the relationship between pain and facial expression and its application to a joint rehabilitation education support robot (2020-2021)
JSPS KAKENHI Grant-in-Aid for Scientific Research (C) (PI: Prof. Joo-Ho Lee)
(Completed) Japanese government scholarship (2018.09-2021.09)
The Ministry of Education, Culture, Sports, Science, and Technology (MEXT) (Dr. Miran Lee)
(Completed) Development of dual-band type stress monitoring and management system for life guardians service (2016-2018)
National Research Foundation of Korea (NRF), (PI: Dr. Inchan Youn)
(Completed) Development of personalized bio-signals analysis techniques and algorithm for muscle stimulation (2016-2018)
National Research Foundation of Korea (NRF), (PI: Dr. Inchan Youn)
(Completed) Development of IT-based rehabilitation medical devices using biological signals (2014-2016)
National Research Foundation of Korea (NRF), (PI: Prof. Deok-Hwan Kim)
(Completed) Development of green biomimetic implantable electronic chip and U-bio information processing platform convergence technology (2014-2016)
Ministry of Science, ICT and Future Planning, (PI: Prof. Deok-Hwan Kim)