Inverse Cooking: Recipe Generation from Food Images

by Amaia Salvador, Michal Drozdzal, Xavier Giro-i-Nieto, Adriana Romero

Released as an article.

2018  

Abstract

People enjoy food photography because they appreciate food. Behind each meal there is a story described in a complex recipe and, unfortunately, by simply looking at a food image we do not have access to its preparation process. Therefore, in this paper we introduce an inverse cooking system that recreates cooking recipes given food images. Our system predicts ingredients as sets by means of a novel architecture, modeling their dependencies without imposing any order, and then generates cooking instructions by attending to both the image and its inferred ingredients simultaneously. We extensively evaluate the whole system on the large-scale Recipe1M dataset and show that (1) we improve performance w.r.t. previous baselines for ingredient prediction; (2) we are able to obtain high-quality recipes by leveraging both image and ingredients; (3) our system is able to produce more compelling recipes than retrieval-based approaches according to human judgment.
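The abstract describes a two-stage pipeline: first predict the ingredients as an unordered set, then generate instructions while attending to both the image and the inferred ingredients. The toy sketch below illustrates that control flow only; all names, dimensions, and weights are hypothetical placeholders, not the authors' implementation (the paper uses learned neural networks for both stages).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only for illustration.
IMG_DIM, EMB_DIM, VOCAB_INGR = 16, 8, 10

def predict_ingredient_set(img_feat, W, threshold=0.5):
    """Set-style ingredient prediction: independent sigmoid scores
    thresholded into a set, so no order is imposed on the output."""
    logits = img_feat @ W                        # (VOCAB_INGR,)
    probs = 1.0 / (1.0 + np.exp(-logits))
    return {i for i, p in enumerate(probs) if p > threshold}

def attend(query, keys):
    """Scaled dot-product attention pooled over a set of key vectors."""
    scores = keys @ query / np.sqrt(query.size)  # (num_keys,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ keys                        # context vector

# Toy image feature and randomly initialized parameters.
img_feat = rng.standard_normal(IMG_DIM)
W_ingr = rng.standard_normal((IMG_DIM, VOCAB_INGR))
ingr_embed = rng.standard_normal((VOCAB_INGR, EMB_DIM))
img_proj = rng.standard_normal((IMG_DIM, EMB_DIM))

# Stage 1: infer the ingredient set from the image.
ingredients = predict_ingredient_set(img_feat, W_ingr)

# Stage 2: a decoding step attends jointly to the projected image
# feature and the embeddings of the inferred ingredients.
query = rng.standard_normal(EMB_DIM)
keys = np.vstack([img_feat @ img_proj]
                 + [ingr_embed[i] for i in sorted(ingredients)])
context = attend(query, keys)
print(len(ingredients), context.shape)
```

In the actual system an instruction decoder would repeat the stage-2 attention at every generated token; here a single step stands in for that loop.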

Archived Files and Locations

application/pdf  10.5 MB
file_znxfrste4vdcrm7pbb3gh5kkya
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2018-12-14
Version   v1
Language   en
arXiv  1812.06164v1
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints).
Catalog Record
Revision: 5836f251-366b-40e7-b336-35663366ed30
API URL: JSON