AI4All Day 12: Capstone project debugging and key insights, AI for Health Care

Nidhi Parthasarathy
3 min read · Aug 20, 2022

--

Nidhi Parthasarathy, Wednesday, July 13th, 2022

An interesting discovery

Today, we continued working on our projects.

Our accuracy was good, but we wanted to improve our model’s precision and recall. So, we added a validation dataset to the training code. However, this didn’t end up changing our results.
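As a rough sketch of the idea (the helper name and split fraction here are illustrative, not our team’s actual code), holding out a validation set just means setting aside a shuffled portion of the data that the model never trains on:

```python
import random

def train_val_split(samples, labels, val_fraction=0.2, seed=0):
    """Shuffle the data and hold out a fraction for validation."""
    indices = list(range(len(samples)))
    random.Random(seed).shuffle(indices)
    n_val = int(len(indices) * val_fraction)
    val_idx, train_idx = indices[:n_val], indices[n_val:]
    train = ([samples[i] for i in train_idx], [labels[i] for i in train_idx])
    val = ([samples[i] for i in val_idx], [labels[i] for i in val_idx])
    return train, val

# Example: 10 samples, 20% held out for validation
(train_x, train_y), (val_x, val_y) = train_val_split(list(range(10)), [0] * 5 + [1] * 5)
print(len(train_x), len(val_x))  # 8 2
```

The key point is that a validation set only helps you *measure* generalization; it doesn’t by itself change what the model learns, which is consistent with what we saw.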

After a long time debugging, and several meetings with my team members, we finally realized that our dataset might be the issue. In particular, because we had many more negative cases than positive ones, our dataset was imbalanced, and that in turn skewed the precision and recall. Researching this later, I learned that this is a common problem across many ML tasks, and especially in medical datasets, where there are often too few people with the disease to give the model enough positive samples to train on.
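A tiny worked example (hypothetical numbers, not our actual dataset) shows why imbalance makes accuracy misleading: on data with 95 negatives and 5 positives, a model that predicts "negative" for everything scores 95% accuracy while its precision and recall are both zero.

```python
def precision_recall_accuracy(y_true, y_pred):
    """Compute precision, recall, and accuracy for binary labels (1 = positive)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall, accuracy

# 95 negatives, 5 positives; the model lazily predicts "negative" every time
y_true = [0] * 95 + [1] * 5
y_pred = [0] * 100
print(precision_recall_accuracy(y_true, y_pred))  # (0.0, 0.0, 0.95)
```

This is exactly the trap we hit: good accuracy that hides the fact that the model is missing the positive cases we care about most.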

Our afternoon session was canceled, so our group worked some more on our project and tried other approaches to increase the precision and recall a bit more.

(Figure: a large difference in the number of total positives compared to total negatives)

Contactless and Ethical Ambient Intelligence Transforming Care Delivery in Health Care Spaces

The last session of the day was by Zane Durante and Alan Luo, both Ph.D. students at Stanford. They talked about contactless and ethical ambient intelligence transforming care delivery in health care spaces. They also discussed the applications of their work in both hospital care (hospital space, ICU patients) and senior care (daily living space, frail seniors).

They started by emphasizing that medical errors are the third leading cause of death. They argued that errors like these will always happen, so what matters is catching and preventing them early to minimize their impact, and that AI can be used to reduce medical error. They talked about how we could take the same technology used in self-driving cars and apply it to healthcare to save more lives, and how technology can transform care delivery. Their vision is “endowing health care spaces with ambient intelligence via smart sensors and ML algorithms.”

The first step to do this is to transform the health care space with sensing capabilities (RGB cameras, depth-based cameras, thermal cameras). The second step is to recognize clinically relevant areas (these are particularly hard to track). The third step is to integrate into the full clinical data ecosystem (sensing capabilities and recognizing actions with existing healthcare space). The last step is to intervene to improve care and reduce costs (help patients mobilize earlier, detect falls).

For senior care, they talked about how the senior population is rapidly growing and how the cost of care is becoming unsustainable. Analyzing the behavior of seniors after their discharge from hospitals provides a strong indication of how much external assistance they would need when living alone. They use continuous monitoring of activities of daily living (ADLs) with smart sensors to help with this.

This was a very inspiring talk, especially to see the application of AI technologies in changing people’s lives, particularly for seniors.

More Project Work :D

After the talk, we spent more time working on our projects, trying out new ideas late into the evening. It was really nice working with a team, learning more about coding and machine learning, and getting to know one another.

Read on for day 13 to see how things started coming together on our capstone project.


Written by Nidhi Parthasarathy

Highschooler from San Jose, CA passionate about STEM/technology and applications for societal good. Avid coder and dancer.