Case Study
Healthcare

Predictive responsiveness in neurological recovery therapy

Highlights

Bitstrapped collaborated with an innovative healthcare company specializing in neurological recovery therapy, which leverages motion technology to help licensed therapists build customized treatment plans for patients who have suffered a stroke. The goal was to prove that targeted therapy works better when it relies on predictive analysis and validated data to guide a patient's post-stroke recovery, so that therapists can confidently prescribe a personalized rehabilitation plan.


Through extensive experimentation and collaboration with the client's medical teams, Bitstrapped ML architects arrived at the best model for the use case: helping therapists design personalized therapy plans that predict patient improvement with high confidence and give them the right guidance to make informed decisions.

Success Metrics

  • Reduced the experimentation cost by 30% 
  • Improved the process by eliminating redundancies and allowing for flexibility when training Machine Learning models 
  • Automated data warehousing, cleaning, preparation, processing, feature engineering, training, and deployment using Kubeflow (on GKE), Dataflow, and BigQuery

Industry
Healthcare
Headquarters
Toronto, ON

Challenge

Our main task was to help healthcare professionals design specialized treatment plans, accurately monitor patient trends, and improve patient outcomes. With raw data being generated by the InMotion™ Robotic arm - as it detects patients' movement throughout their treatment - and patient-specific metadata produced alongside it, all data had to be benchmarked so that patient progress could be tracked over time.

With the data available, we had to draw correlations between the duration of training, the type of training, and patient improvement in order to yield valuable insights that healthcare experts could use to run evaluations and make informed decisions when designing effective plans.
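The case study doesn't share code, but the kind of correlation analysis described here can be sketched in plain Python. The field names and values below are hypothetical, purely to illustrate relating training duration to an improvement score:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical data: minutes of robotic training per week vs. change in a motor score.
training_minutes = [60, 90, 120, 150, 180]
score_improvement = [1.2, 1.8, 2.1, 2.9, 3.4]

r = pearson(training_minutes, score_improvement)
```

A coefficient near 1.0 would suggest longer training correlates with greater improvement; in practice, the production analysis ran at warehouse scale rather than on in-memory lists.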

Solution

The solution centers on patient data workloads elevated by the speed, scale, automation, and AI capabilities of Google Cloud's data and analytics foundations. By leveraging a readily available dataset, we were able to get up and running with data ingestion, ETL processes, and trained ML pipelines.

We began by building the data warehouse for all of their data collection, followed by data migration and effective storage.

We then started to ingest the raw data from the Robotic arm as well as the unfiltered application data, migrating it all into the data warehouse. Given the volume and real-time nature of the data, we leveraged Google Cloud Dataflow for parallel processing and BigQuery for the warehousing itself.
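The production pipeline ran on Dataflow and BigQuery, but the core pattern - parse each raw device event, clean it in parallel, then load the results - can be sketched with the standard library alone. The record fields here are hypothetical stand-ins for the real device schema:

```python
import json
from concurrent.futures import ThreadPoolExecutor

def clean_record(raw: str) -> dict:
    """Parse one raw JSON reading from the device and keep only the needed fields."""
    rec = json.loads(raw)
    return {
        "patient_id": rec["patient_id"],
        "session_ts": rec["session_ts"],
        "displacement_mm": float(rec["displacement_mm"]),
    }

def ingest(raw_events: list[str]) -> list[dict]:
    """Transform raw events in parallel, mimicking a parallel transform stage."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(clean_record, raw_events))

# Hypothetical raw events as they might arrive from the device stream.
events = [
    '{"patient_id": "p-001", "session_ts": "2023-05-01T10:00:00", "displacement_mm": "12.5"}',
    '{"patient_id": "p-002", "session_ts": "2023-05-01T11:00:00", "displacement_mm": "8.0"}',
]
cleaned = ingest(events)
```

In the real system, Dataflow distributes this transform across workers and streams the cleaned rows into BigQuery tables instead of an in-memory list.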

Meanwhile, the Robot technology and core application ran on separate cloud services, which meant we had to build out a hybrid cloud approach with two considerations in mind:

  • First, structure the architecture in a way that leverages Google Cloud AI and machine learning capabilities; and
  • Second, ensure a secure hybrid cloud strategy was in place, with emphasis on secure networking for sensitive health data moving between systems.

Once the data was in the warehouse, we kicked off the experimental phase at two levels: first with the incoming data, and second with Vertex AI Workbench, experimenting with the available data and the feature engineering techniques most suitable for the business use case. To get there, we consulted the client's team of medical experts to gain the domain knowledge needed to enhance the feature engineering.
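Domain-informed feature engineering of this kind typically turns a raw motion trace into a handful of clinically meaningful summaries. The features below are hypothetical illustrations, not the client's actual features:

```python
def session_features(displacements: list[float]) -> dict:
    """Summarize one therapy session's motion trace into candidate model features.

    `displacements` is a hypothetical series of arm-displacement samples (mm)
    recorded during a session.
    """
    n = len(displacements)
    mean_d = sum(displacements) / n
    # Jerkiness proxy: mean absolute change between consecutive samples.
    # Smoother motion (lower value) might indicate recovering motor control.
    jerk = sum(abs(b - a) for a, b in zip(displacements, displacements[1:])) / (n - 1)
    return {
        "mean_displacement": mean_d,
        "peak_displacement": max(displacements),
        "jerkiness": jerk,
    }
```

Clinicians' input shapes which of these summaries actually carry signal; the experimentation phase in Workbench is where such candidates get validated or discarded.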

Given its elastic compute capabilities and its capacity to run both small statistical models and larger GPU-intensive models, Vertex AI Workbench allowed us to track all metrics, monitor patient trends, and run self-hosted solutions.

Finally, we migrated the feature engineering and data pre-processing steps into Dataflow pipelines, integrated model training with Vertex AI Training, and orchestrated the end-to-end process - from data validation to model deployment - using Kubeflow Pipelines. Kubeflow allowed us to break the process into independent steps: a feature engineering step can change without touching the data cleaning pipelines, and the model training steps can change without affecting feature engineering. This service-style deployment, orchestrated as smaller units of work, lets teams work autonomously and allows data scientists and data engineers to focus on the parts they're best at.
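The independence of those steps is the key property. A minimal sketch of the idea, with stand-in functions in place of real containerized Kubeflow components:

```python
from typing import Callable

# Each stage is an independent, swappable unit of work, mirroring how
# Kubeflow Pipelines chains containerized steps in a DAG.
def clean(rows):
    return [r for r in rows if r is not None]

def features(rows):
    return [{"x": r, "x_sq": r * r} for r in rows]

def train(feats):
    return {"n_samples": len(feats)}  # stand-in for a real training step

def run_pipeline(data, steps: list[Callable]):
    """Feed each step the previous step's output, like tasks in a pipeline DAG."""
    out = data
    for step in steps:
        out = step(out)
    return out

model = run_pipeline([1.0, None, 2.0], [clean, features, train])
```

Because each step only depends on its input and output contract, `features` can be swapped for a new version without touching `clean` or `train` - the same decoupling Kubeflow provides at the level of containers.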

Kubeflow was also set up with appropriate production and experiment segmentation, so the team can conduct A/B testing and experimentation in the future, deploying and training multiple versions of the production models without any interruptions.
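One common way to implement that segmentation (not necessarily the client's exact mechanism) is deterministic traffic splitting: hash a stable identifier so each patient is consistently routed to the same model version while an experiment runs. A sketch:

```python
import hashlib

def assign_variant(patient_id: str,
                   variants=("model_v1", "model_v2"),
                   traffic_split=0.9) -> str:
    """Deterministically route a patient to a model version by hashing their ID.

    `traffic_split` is the fraction of traffic kept on the production model
    (variants[0]); the rest goes to the experimental version. Hashing keeps
    assignments stable across requests, so production and experiment traffic
    stay cleanly segmented.
    """
    bucket = int(hashlib.sha256(patient_id.encode()).hexdigest(), 16) % 100
    return variants[0] if bucket < traffic_split * 100 else variants[1]
```

Stable assignment matters clinically as well as statistically: a patient's predictions shouldn't oscillate between model versions mid-experiment.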

Results

Today, there is a great need in the healthcare industry to shift toward a predictive approach to patient care and to enhance the overall patient experience. Utilizing the relevant data means running large models in the cloud, reducing the total cost of ownership, and generating consistently valuable insights for the business. This allows healthcare professionals working closely with patients to transform data into forward-looking insights tailored to each patient.

The ML foundations built by Bitstrapped will advance the understanding of mobility and rehabilitation, and make it possible to grow and integrate these datasets with robotic device sensors in settings such as hospitals, outpatient care centers, and patients' homes. These outcomes will positively impact the patient journey, end to end.

Through the successful integration of data analytics to evaluate patients' therapy and assist their recovery routines, clinicians and hospital staff can gain confidence in using the data in flexible ways. The ability to collect data in real time from the robotic devices, store it in the cloud, and run analyses enables facilities to gain performance insights, streamline the coaching and training of staff, and make informed decisions.

