International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 187 - Issue 93
Published: March 2026
Authors: Andrei Carl L. Castro, Nathan Sheary G. Muñoz, Neo Jezer A. Pare, Joey S. Aviles
DOI: 10.5120/ijca2026926609

Andrei Carl L. Castro, Nathan Sheary G. Muñoz, Neo Jezer A. Pare, Joey S. Aviles. NutriSnap: Mobile-Based Food Recognition with Caloric and Macronutrient Estimation using MobileNetv2 and YOLOv8n. International Journal of Computer Applications. 187, 93 (March 2026), 31-37. DOI=10.5120/ijca2026926609
@article{ 10.5120/ijca2026926609,
author = { Andrei Carl L. Castro and Nathan Sheary G. Muñoz and Neo Jezer A. Pare and Joey S. Aviles },
title = { NutriSnap: Mobile-Based Food Recognition with Caloric and Macronutrient Estimation using MobileNetv2 and YOLOv8n },
journal = { International Journal of Computer Applications },
year = { 2026 },
volume = { 187 },
number = { 93 },
pages = { 31-37 },
doi = { 10.5120/ijca2026926609 },
publisher = { Foundation of Computer Science (FCS), NY, USA }
}
%0 Journal Article
%D 2026
%A Andrei Carl L. Castro
%A Nathan Sheary G. Muñoz
%A Neo Jezer A. Pare
%A Joey S. Aviles
%T NutriSnap: Mobile-Based Food Recognition with Caloric and Macronutrient Estimation using MobileNetv2 and YOLOv8n
%J International Journal of Computer Applications
%V 187
%N 93
%P 31-37
%R 10.5120/ijca2026926609
%I Foundation of Computer Science (FCS), NY, USA
This paper presents NutriSnap, a mobile-based food recognition system that estimates caloric and macronutrient content from user-captured images. The system integrates MobileNetV2 for image classification and YOLOv8n for object detection in a modular two-stage pipeline. Trained on the Food-101, UEC-256, Food2K, and a custom Filipino food dataset (Phil23), the MobileNetV2 model achieved a Top-1 validation accuracy of 73.19% across 189 food categories, with a macro-averaged F1-score of 0.73 and a weighted F1 of 0.73. The YOLOv8n model, trained using a three-stage fine-tuning approach with synthetic data augmentation, attained 96.1% precision, 92.9% recall, and 97.3% mAP50 on the validation set. Both models were converted to TensorFlow Lite (TFLite) and integrated into a Flutter-based Android application. Nutritional values are retrieved from the USDA FoodData Central and Philippine Food Composition Tables (PhilFCT) databases using a proportion-based formula keyed to user-entered serving weight. System usability was evaluated using the System Usability Scale (SUS) with 68 participants, yielding a mean score of 80.62, categorized as "Excellent." Comprehensive experimental results, including training convergence curves, per-class performance analysis, stage-wise detection metrics, and comparative evaluation against related works, demonstrate that the integrated pipeline is effective for real-time dietary monitoring on resource-constrained mobile devices.
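The proportion-based nutrient lookup described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the database entries (e.g. from USDA FoodData Central or PhilFCT) are expressed per 100 g, and the dictionary keys and function name are hypothetical.

```python
# Illustrative sketch of proportion-based nutrient estimation keyed to a
# user-entered serving weight. Assumes database values are per 100 g;
# field names are hypothetical, not taken from the paper.

def scale_nutrients(per_100g: dict, serving_weight_g: float) -> dict:
    """Scale per-100 g nutrient values to the entered serving weight."""
    factor = serving_weight_g / 100.0
    return {name: round(value * factor, 2) for name, value in per_100g.items()}

# Example: a 150 g serving of a food whose database entry lists these
# per-100 g values (illustrative numbers only).
entry_per_100g = {"calories_kcal": 215.0, "protein_g": 13.0,
                  "fat_g": 15.0, "carbs_g": 5.0}
print(scale_nutrients(entry_per_100g, 150.0))
```

Under this assumption, the formula is simply `nutrient × (serving weight / 100)` applied to each macronutrient returned for the recognized food class.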