Eating Smart: Advancing Health Informatics with the Grounding DINO-based Dietary Assistant App
Abstract
The Smart Dietary Assistant project combines mobile technology and Machine Learning (ML) to offer personalized advice for people with dietary concerns such as diabetes. The approach is user-centered, helping people make informed decisions about their diet using the Grounding DINO model. Grounding DINO pairs a text encoder with an image backbone to achieve strong detection accuracy without relying on a task-specific labeled dataset, making it practical for real-world situations with varied food types. The model reports a 52.5 AP score on the COCO benchmark and uses attention mechanisms that match image features against user-provided text labels, allowing precise object recognition from food images. This capability is at the core of the app, turning a smartphone into a helpful dietary advisor that enables people to manage their health effectively.
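As an illustrative sketch (not the authors' implementation), text-prompted zero-shot detection of this kind can be pictured as scoring candidate image regions against user-supplied food labels and keeping only confident matches. The `Detection` shape and threshold value below are assumptions for the example, loosely mirroring the box/text-threshold filtering used by Grounding DINO-style detectors:

```typescript
// Hypothetical shape of one detection from a Grounding DINO-style model.
interface Detection {
  label: string; // text phrase matched against the image region
  score: number; // similarity between region features and the text prompt
  box: [number, number, number, number]; // [x1, y1, x2, y2] in pixels
}

// Keep only detections whose prompt similarity clears a threshold,
// as zero-shot detectors do with their text-threshold parameter.
function filterDetections(
  detections: Detection[],
  textThreshold = 0.25,
): Detection[] {
  return detections.filter((d) => d.score >= textThreshold);
}

const raw: Detection[] = [
  { label: "apple", score: 0.62, box: [10, 10, 120, 130] },
  { label: "bread", score: 0.18, box: [0, 0, 50, 40] },
];
console.log(filterDetections(raw).map((d) => d.label)); // → ["apple"]
```

The same pattern extends naturally to per-label thresholds if some food categories prove harder to detect than others.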
The app uses the device camera to take photos, which the model analyzes to detect and correctly categorize food items. A distinguishing design choice is independence from external cloud databases: the application relies on a self-hosted PostgreSQL database, preserving data integrity and control. This database stores food product information, user profiles, and health insights derived from consumption patterns. The result is fast, reliable data access and stronger user privacy through localized storage within the organizational infrastructure.
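A minimal sketch of how recognized foods might be written to such a self-hosted PostgreSQL store is shown below. The table and column names are assumptions for illustration, not the project's actual schema; the parameterized query text follows the `$1..$n` placeholder style of the node-postgres ("pg") driver:

```typescript
// Assumed record shape for a recognized food item (illustrative names only).
interface FoodRecord {
  name: string;
  caloriesPer100g: number;
  proteinG: number;
  sugarG: number;
}

// Parameterized INSERT text; placeholders avoid SQL injection when
// passed to a PostgreSQL client alongside the values array.
const insertFoodSql =
  "INSERT INTO foods (name, calories_per_100g, protein_g, sugar_g) " +
  "VALUES ($1, $2, $3, $4)";

// Build the values array in placeholder order.
function insertParams(r: FoodRecord): [string, number, number, number] {
  return [r.name, r.caloriesPer100g, r.proteinG, r.sugarG];
}

const apple: FoodRecord = {
  name: "apple",
  caloriesPer100g: 52,
  proteinG: 0.3,
  sugarG: 10.4,
};
console.log(insertParams(apple)); // → ["apple", 52, 0.3, 10.4]
```

In a running app, `insertFoodSql` and `insertParams(...)` would be passed to the driver's `query` method; keeping query text and values separate is what lets the database enforce the integrity the abstract emphasizes.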
The app focuses on user experience, allowing users to create profiles that capture their dietary preferences and receive tailored nutrition tips. Beyond calorie information, the app provides insights into nutrients such as proteins, vitamins, and minerals, helping users choose foods for weight management, muscle building, or managing health conditions. It also assesses food compatibility against each profile and gives personalized recommendations for alternatives and recipes. This kind of personal guidance is especially convenient for people with specific dietary needs, as it helps them make healthy choices confidently.
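The compatibility check described above can be sketched as a pure function. Everything here is hypothetical: the profile field, the per-serving sugar limit, and the alternative-ranking rule are placeholders standing in for whatever richer rules the app actually applies:

```typescript
// Hypothetical profile with a per-serving sugar limit, e.g. for a
// user managing diabetes; the threshold is illustrative only.
interface Profile {
  maxSugarGPerServing: number;
}

interface NutrientInfo {
  name: string;
  sugarG: number;
  proteinG: number;
}

// Flag foods whose sugar content exceeds the profile's limit and
// suggest the lowest-sugar alternative from a candidate list.
function recommend(
  food: NutrientInfo,
  alternatives: NutrientInfo[],
  profile: Profile,
): string {
  if (food.sugarG <= profile.maxSugarGPerServing) {
    return `${food.name}: compatible`;
  }
  const best = [...alternatives].sort((a, b) => a.sugarG - b.sugarG)[0];
  return `${food.name}: over sugar limit; consider ${best.name}`;
}

const profile: Profile = { maxSugarGPerServing: 8 };
const soda: NutrientInfo = { name: "soda", sugarG: 35, proteinG: 0 };
const options: NutrientInfo[] = [
  { name: "sparkling water", sugarG: 0, proteinG: 0 },
  { name: "orange juice", sugarG: 21, proteinG: 1.7 },
];
console.log(recommend(soda, options, profile));
// → "soda: over sugar limit; consider sparkling water"
```

A real system would weigh multiple nutrients and goals (protein for muscle building, calories for weight management) rather than a single threshold, but the shape of the decision is the same.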
Developed using React Native and TypeScript, the Smart Dietary Assistant app runs consistently across devices and platforms. Beyond the detection model, it incorporates technologies chosen for reliable food recognition, scalability for future enhancements, and seamless integration with other dietary tools. Users can scan food items with the camera, track eating habits, receive insightful analysis, and interact with an assistant for recommendations. User authentication protects data, while customizable settings enhance the experience. React Native enables smooth screen transitions, the expo-camera module provides scanning capabilities, and local storage manages data efficiently, producing an interface that is both easy and appealing to use.
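The local-storage role in this stack can be sketched with an in-memory stand-in that mirrors the string key/value `getItem`/`setItem` shape of common React Native storage libraries (this class and its keys are assumptions for illustration, not the app's code):

```typescript
// In-memory stand-in mirroring the async string key/value API of
// React Native storage libraries; illustrative only.
class LocalStore {
  private data = new Map<string, string>();

  async setItem(key: string, value: string): Promise<void> {
    this.data.set(key, value);
  }

  async getItem(key: string): Promise<string | null> {
    return this.data.get(key) ?? null;
  }
}

// Cache a day's scanned meals under a date key, as a habit-tracking
// screen might, so the list survives between app sessions.
async function demo(): Promise<string | null> {
  const store = new LocalStore();
  await store.setItem("meals:2024-05-01", JSON.stringify(["apple", "salad"]));
  return store.getItem("meals:2024-05-01");
}

demo().then((v) => console.log(v)); // → '["apple","salad"]'
```

Serializing to JSON strings keeps the interface identical whether the backing store is in memory, on-device storage, or a sync layer over the PostgreSQL database.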
The Smart Dietary Assistant app’s interface stands out for balancing aesthetics and usability. Clear buttons and a vibrant color scheme make navigation and feature selection simple. The chatbot feature, represented by an avatar, encourages user engagement and personalized guidance seeking. Users find camera scanning convenient, although varying lighting conditions may affect accuracy; this feedback identifies a concrete area for improvement toward reliable performance in all conditions.
The choice of a self-hosted PostgreSQL database for this project re-emphasizes its importance in the realms of health informatics and nutritional science. Storing data without depending on outside cloud services keeps the information reliable and reduces the risk of it being altered from the outside.
In the future, the Smart Dietary Assistant is planned to integrate with external devices. The application could sync with fitness trackers and smartwatches to give timely suggestions based on physiological data such as blood sugar levels and calories burned, connecting users to individualized advice matched to their activity patterns. The application is also open to collaboration with AI-powered tools for developing personalized recipes and meal plans that respect the user's preferences, dietary restrictions, and real-time physiological information. For conditions like diabetes, this holistic approach to diet management makes the app more effective, supports objectives such as weight management or muscle building, and promotes the user's overall well-being.
Key words: Food Image Recognition, Machine Learning in Nutrition, Zero-Shot Object Detection.
Received Date: April 07, 2024 Accepted Date: May 09, 2024
Published Date: June 01, 2024
Available Online at: https://www.ijsrisjournal.com/index.php/ojsfiles/article/view/153
DOI: https://doi.org/10.5281/zenodo.11243881

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.