Joint Embedding of Food Photographs and Blood Glucose for Improved Calorie Estimation


Lida Zhang; Sicong Huang; Anurag Das; Edmund Do; Namino Glantz; Wendy Bevier; Rony Santiago; David Kerr; Ricardo Gutierrez-Osuna; Bobak Jack Mortazavi

Abstract


Type 2 diabetes has a significant impact on individuals' health and well-being, and diet monitoring is an important tool in its treatment. Accurate estimation of meal intake is essential for effective diet and behavior interventions. While continuous glucose monitors (CGMs) have demonstrated the ability to estimate the carbohydrate content of meals, CGM data alone is insufficient for capturing other nutritional information, owing to variation across food types and individuals' health conditions. We therefore propose a multi-modality model that augments CGM-based inverse metabolic models using both CGM-captured interstitial glucose data and food image data. A late-fusion approach aggregates glucose information, extracted by an attention-based Transformer and Gaussian area under the curve (gAUC) features, with image information extracted by a vision transformer. We test our model on a dataset of Freestyle Libre Pro CGM recordings and photographs of breakfasts and lunches with known, fixed caloric content, collected from 27 participants. Our joint embedding approach to learning calorie estimates from both CGM and image data achieves an average normalized root mean squared error (NRMSE) of 0.34 for calorie prediction, with a correlation of 0.52: a 15.0% improvement over CGM-only models and a 17.1% improvement over image-only models.
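As a rough illustration of the late-fusion design described in the abstract, the sketch below shows one way the three branches (Transformer-encoded CGM sequence, gAUC feature vector, ViT image embedding) could be concatenated and regressed to a single calorie estimate. This is a minimal PyTorch-style sketch under stated assumptions, not the authors' implementation: all layer sizes, the mean-pooled Transformer encoding, and the use of a precomputed 768-dimensional ViT [CLS] embedding are illustrative guesses.

```python
import torch
import torch.nn as nn

class LateFusionCalorieEstimator(nn.Module):
    """Hypothetical late-fusion model: CGM Transformer branch +
    gAUC features + ViT image embedding -> calorie regression."""

    def __init__(self, glucose_dim=1, d_model=64, vit_dim=768, gauc_dim=4):
        super().__init__()
        # Glucose branch: project scalar readings, encode with a Transformer.
        self.glucose_proj = nn.Linear(glucose_dim, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True)
        self.glucose_encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)
        # Image branch: assume a precomputed ViT [CLS] embedding as input.
        self.image_proj = nn.Linear(vit_dim, d_model)
        # Late fusion: concatenate branch outputs, regress to one value.
        self.head = nn.Sequential(
            nn.Linear(d_model + d_model + gauc_dim, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, glucose_seq, gauc_feats, vit_embedding):
        # glucose_seq: (B, T, glucose_dim) interstitial glucose readings.
        g = self.glucose_encoder(self.glucose_proj(glucose_seq)).mean(dim=1)
        v = self.image_proj(vit_embedding)             # (B, d_model)
        fused = torch.cat([g, v, gauc_feats], dim=-1)  # late fusion
        return self.head(fused).squeeze(-1)            # predicted calories

# Example with random tensors (batch of 8 meals, 48 CGM samples each):
model = LateFusionCalorieEstimator()
calories = model(torch.randn(8, 48, 1), torch.randn(8, 4), torch.randn(8, 768))
print(calories.shape)  # torch.Size([8])
```

Fusing at the feature level (rather than averaging per-branch predictions) lets the regression head weigh the glucose and image evidence jointly, which is consistent with the improvements over the single-modality baselines reported above.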

Keywords: Machine learning; Continuous glucose monitors; Diabetes; Diet monitoring; Nutrition
