AI Algorithm Tells You the Ingredients in Your Meal Based on a Picture
Your food photography habit could soon be good for more than just updating your Instagram. As Gizmodo reports, a new AI algorithm is trained to analyze food photos and match them with a list of ingredients and recipes.
The tool was developed by researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). To build it, they compiled information from sites like All Recipes and Food.com into a database dubbed Recipe1M, according to their paper. With more than a million annotated recipes at its disposal, a neural network sifted through each one, learning which ingredients are associated with which types of images along the way.
The result is Pic2Recipe, an algorithm that can deduce key details about a food item just by looking at its picture. Show it a picture of a cookie, for example, and it will tell you it likely contains sugar, butter, eggs, and flour. It will also recommend recipes for something similar pulled from the Recipe1M database.
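At a high level, a system like this maps a photo to a feature vector and then retrieves the recipe whose precomputed vector lies closest in a shared space. The sketch below is only a toy illustration of that retrieval step, not the researchers' actual model: the recipe names and the tiny three-dimensional vectors are invented stand-ins, whereas the real system learns high-dimensional embeddings with a neural network over the Recipe1M database.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical recipe embeddings (stand-ins for Recipe1M entries).
recipe_index = {
    "sugar cookies": [0.9, 0.1, 0.0],
    "vegetable stir-fry": [0.1, 0.8, 0.3],
    "fruit smoothie": [0.2, 0.3, 0.9],
}

def closest_recipe(image_vector):
    """Return the recipe whose embedding is most similar to the image's."""
    return max(recipe_index, key=lambda name: cosine(image_vector, recipe_index[name]))

# A made-up image embedding that lands near the cookie recipe.
print(closest_recipe([0.85, 0.15, 0.05]))  # -> sugar cookies
```

In the real system, both images and recipes are embedded by trained networks so that matching pairs end up close together; the lookup itself is essentially this kind of nearest-neighbor search.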
Pic2Recipe is still a work in progress. While it has had success with simple recipes, more complicated items, such as smoothies or sushi rolls, seem to confuse the system. Overall, it suggests recipes with an accuracy rate of about 65 percent.
Researchers see their creation being used as a recipe search engine or as a tool for situations where nutritional information is lacking. “If you know what ingredients went into a dish but not the amount, you can take a photo, enter the ingredients, and run the model to find a similar recipe with known quantities, and then use that information to approximate your own meal,” lead author Nick Hynes told MIT News.
Before taking the project any further, the team plans to present its work at the Computer Vision and Pattern Recognition Conference in Honolulu later this month.