Gastronomy Flagship Project: Building a GUI prototype
for an AI-powered recipe-creation app

Creating the future of food

Founded with the vision of creating “AI that unleashes human imagination and creativity,”
Sony AI is branching out from Sony's existing business areas of gaming and imaging & sensing to
explore the world of gastronomy—and the Gastronomy Flagship Project is leading the charge
to foster imagination in a new field. The team is hard at work on a new AI-powered recipe-creation app
that helps chefs tap into their creativity and dream up new recipes, and they recently showcased
a prototype of the app's graphical user interface (GUI) at Madrid Fusión 2020,
the world's premier gastronomy showcase event. We sat down with
Sony AI COO Michael Spranger and Design Producer Tatsushi Nashida for a deep dive
into the GUI design and the ingredients of inspiration inside.

(R to L) Sony AI COO Michael Spranger, Aroma Specialist/Sommelier* François Chartier, Chartier Lab collaborator Nicolas Roché, Sony AI Director Masahiro Fujita,
Sony Group Corp. Creative Center Design Producer Tatsushi Nashida

* François Chartier was named “Best Sommelier in the World” (Grand Prix Sopexa 1994, Paris) and is one of the special advisors for the Gastronomy Flagship Project.

Exploring culinary frontiers
through new recipes

Could you give us an overview of the Gastronomy Flagship Project and the AI-powered recipe-creation app you’re building?

Michael: The Gastronomy Flagship Project got its start with the idea that we could bring new value to life by applying Sony’s AI and robotics expertise to the food sector. In working toward that basic vision, we’ve set two main goals. One is to develop a robot that can help out in kitchen settings, giving chefs a new tool for making great-tasting food. The other is to develop an AI app for chefs who like to push boundaries and expand their culinary palettes into tastes that people have never experienced before, all with an eye on health and environmental sustainability. We’ve been doing research and development on both fronts ever since Sony AI got rolling in 2020, seeing as how they align perfectly with the company’s identity.

Nashida: I joined the project in 2018, when Michael’s team asked me if I’d be interested in creating a video that’d give people from both inside and outside the Sony organization an intuitive, visual grasp of how AI and robotics could shape the kitchens of the future. After getting on board, I helped develop the “AI×ROBOTICS×COOKING” concept movie with my colleagues at the Creative Center. The end result highlighted the idea of robotic arms redefining cooking methods and kitchen equipment, but the process of creating the footage also got the team thinking about new possibilities for utilizing the technologies. I remember talking with Michael about how we would need to research tastes and recipes if we wanted to get AI and robotics to cook autonomously. That conversation was where the AI-powered recipe-creation app got its start.

The “AI×ROBOTICS×COOKING” concept movie

Michael: With the app, we’re trying to leverage AI to create a source of “new recipes” and help chefs harness the full power of their creativity. AI is a powerful means of mining data for characteristic features and using those findings to make projections. So how does that work in terms of food? Well, think of ingredients as an array of data points with a global reach and loads of information, ranging from taste and aroma to molecular structures. If we could create an app to analyze all that data together and come up with new pairings and culinary approaches, we’d be able to give chefs a new source of inspiration.
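(To make Michael’s “ingredients as data points” framing a little more concrete, here is a purely illustrative sketch of how such an analysis could work: give each ingredient a feature profile, such as a set of aroma compounds, and rank candidate pairings by how much those profiles overlap. The ingredient names, compounds, and scoring method below are hypothetical placeholders, not Sony AI’s actual data or algorithm.)

```python
# Purely illustrative sketch: score ingredient pairings by the overlap of
# hypothetical aroma-compound profiles. Every name and compound is made up.

def pairing_score(profile_a: set[str], profile_b: set[str]) -> float:
    """Jaccard overlap of two aroma profiles, from 0.0 (nothing shared) to 1.0."""
    return len(profile_a & profile_b) / len(profile_a | profile_b)

# Hypothetical aroma-compound sets per ingredient
AROMA_PROFILES = {
    "sea urchin": {"dimethyl sulfide", "bromophenol", "umami note"},
    "cheese":     {"butyric acid", "diacetyl", "umami note"},
    "chocolate":  {"pyrazine", "diacetyl", "vanillin"},
    "strawberry": {"furaneol", "vanillin", "hexenal"},
}

def suggest_pairings(target: str, top_n: int = 3) -> list[tuple[str, float]]:
    """Rank every other ingredient by its compatibility with the target."""
    scores = [(name, pairing_score(AROMA_PROFILES[target], profile))
              for name, profile in AROMA_PROFILES.items() if name != target]
    return sorted(scores, key=lambda item: item[1], reverse=True)[:top_n]

if __name__ == "__main__":
    for name, score in suggest_pairings("sea urchin"):
        print(f"sea urchin + {name}: compatibility {score:.2f}")
```

In a real system, these hand-written sets would stand in for the kind of large-scale taste, aroma, and molecular data Michael describes.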

A big opportunity eventually came our way: we got an invitation to showcase the project at the 2020 edition of Madrid Fusión, a gastronomical mecca of sorts for the world’s top chefs. To start turning our concept for an AI-driven recipe app into something tangible, I asked Nashida-san to design a GUI prototype for ingredient pairings. I told him I was looking for an interface that would capture Sony AI’s vision for the future of food. We had no foundation to build on, so we had to start from the ground up, and it was a pretty abstract place to start: I was asking him to think about what a “new recipe” even is and then go from there.

Nashida: So, I had my work cut out for me. How could I interpret what a “new recipe” is? How could I get that essence to come out in an interface? Well, I started by thinking about traditional approaches. Recipes have always been about recreating something—a means of reproducing a dish. But that got me wondering. Is that what recipes have to be? I started understanding that maybe we could make recipes to create rather than just recreate, to give life to dishes and tastes that no one’s ever tried before. I started seeing our recipe-creation app as a supportive, AI-driven muse for creative-minded chefs. It could suggest fresh ingredient pairings, control potential Chef Assisting Cooking Robots in the kitchens of the future, and offer a wealth of other assistance. In my mind, the app wasn’t just going to be a tool on a device. I saw it as a member of the kitchen brigade, a real, active contributor. I told Michael about what I was hoping for the app to be, and he gave me the go-ahead.

With that overarching vision taking shape, I got to work on the design concept for the GUI prototype. The first thing I did was some research. Looking at how existing interfaces presented ingredient pairings, I found that the majority used inorganic charts and formulas—styles that might seem a bit unfamiliar to the average chef. I visited some kitchens, too, and found that the creative process was often collaborative. Instead of the chef directing everything on his or her own, there were lots of places where the whole kitchen staff pitched in with culinary ideas. Based on that input, I decided to build the GUI concept around three core elements: straightforward, intuitive language to make sure that every chef could connect with the content; taste visualizations to reach chefs and their teams on a more emotional level; and dialogical support to facilitate communication.

Putting chefs on
a new creative level

Madrid Fusión is a forum for the culinary world and by the culinary world: chefs and people from the food industry gather to listen to other chefs and experts talk about their work. How did the Sony team go about designing the GUI prototype to resonate with that audience?

Michael: Our biggest focal points for the Madrid Fusión presentation were, first, highlighting ideas for innovative pairings that would grab the chefs in the audience and, second, designing a GUI prototype to give those ideas as strong a pull as possible. Working with François Chartier, our advisor and a world authority on aromatic science, we came up with ideas for dishes you wouldn’t normally find on a menu: sea urchin meets cheese meets chocolate, for example. For the GUI design, we decided to let Nashida-san go where his inspiration took him; there weren’t any specific directions from our end.

Nashida: Being at Madrid Fusión was about more than just showcasing our design; it also served as an opportunity to show people in the gastronomy community that Sony was serious about making its mark in the field. We wanted the GUI to present the pairing ideas in a way that would be easy for chefs to connect with while also going in a whole new direction, design-wise. AI-driven interfaces in the food sector tend to plot hundreds and hundreds of ingredients on a plane, with webs of lines linking good ingredient matches. The problem with that approach, though, is that you end up with a messy labyrinth of lines. For chefs, it’s not an easy presentation to navigate. I knew that was something we wanted to avoid.

Instead of a planar surface, I went with a spacelike, three-dimensional setting with various ingredients floating around in the background, the target ingredient (the centerpiece of the chef’s dish idea) sitting right in the middle, and suggestions for good pairings hovering in the vicinity. It came together as a pretty simple design, a touch of futuristic flavor filling out the aesthetic. The design also blurs ingredients that wouldn’t go as well with the target ingredient into the background, which helps give a visual cue to the infinite range of potential pairings—including ones that might not yet be visible. To reach the chefs at Madrid Fusión, I also knew that we had to make the ingredient images pop. I picked out colorful, vibrant pictures for every ingredient and processed the visuals with a kind of plastic-wrap filter so that they’d exude a fresh, tasty vibe.
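(As a rough, hypothetical illustration of the placement logic behind a layout like that, and not the prototype’s actual code: each candidate ingredient’s compatibility with the target could drive both how close it floats to the center of the scene and how strongly it is blurred into the background.)

```python
# Illustrative sketch only: place pairing candidates in a simple 3-D scene
# around a target ingredient that sits at the origin. Strong matches stay
# close and sharp; weak matches drift deeper into the background and blur.
# All numbers here are arbitrary.

import math
from dataclasses import dataclass

@dataclass
class IngredientNode:
    name: str
    compatibility: float    # 0.0 (poor match) .. 1.0 (strong match)
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0          # depth: larger z = farther into the background
    blur: float = 0.0       # 0.0 = sharp, 1.0 = fully blurred

def layout(candidates: list[IngredientNode],
           inner_radius: float = 1.0, max_depth: float = 5.0) -> list[IngredientNode]:
    """Spread candidates around the (implicit) target at the origin."""
    count = max(len(candidates), 1)
    for i, node in enumerate(candidates):
        angle = 2 * math.pi * i / count
        distance = inner_radius + (1.0 - node.compatibility) * max_depth
        node.x = distance * math.cos(angle)
        node.y = distance * math.sin(angle)
        node.z = (1.0 - node.compatibility) * max_depth
        node.blur = 1.0 - node.compatibility
    return candidates

# Example: a strong match stays near and sharp, a weak one recedes and blurs.
for n in layout([IngredientNode("cheese", 0.9),
                 IngredientNode("chocolate", 0.7),
                 IngredientNode("celery", 0.2)]):
    print(f"{n.name}: depth {n.z:.1f}, blur {n.blur:.1f}")
```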

The GUI prototype

Nashida: The big question was how to make it clear that two given ingredients strike a good match together. I started out using lines to connect ingredients with high pairing compatibility, but the presentation came out looking too complicated every time. I was scratching my head about what to do when I came across something that Mr. Chartier had said: “Every ingredient pairing is greater than the sum of its parts... 1 + 1 = 3!” When you put ingredient A and ingredient B together, you might get a taste you never expected, something totally new. That dynamic harmony, for Chartier, is what pairings are all about. Aiming to render a sense of culinary synergy in my GUI design, I made it so that matching ingredients would join together, expand, and take on new hues to visualize how new tastes are born, the element that makes the pairing process so special.
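(Purely as an illustrative sketch of that merging behavior, not the prototype’s implementation: when a pairing’s compatibility crosses some threshold, the two ingredient bubbles could be combined into one larger bubble with a blended hue, echoing the “1 + 1 = 3” idea. All values below are made up.)

```python
# Hypothetical sketch: merge two ingredient bubbles when their pairing is
# strong enough, blending their colors and growing beyond the simple sum.

def blend_color(rgb_a: tuple[int, int, int], rgb_b: tuple[int, int, int]) -> tuple[int, ...]:
    """Average the two colors channel by channel to suggest a new, shared hue."""
    return tuple((a + b) // 2 for a, b in zip(rgb_a, rgb_b))

def merge_bubbles(size_a: float, size_b: float,
                  rgb_a: tuple[int, int, int], rgb_b: tuple[int, int, int],
                  compatibility: float, threshold: float = 0.6):
    """Return a merged bubble when the match is strong enough, else None."""
    if compatibility < threshold:
        return None  # not a strong match: keep the ingredients separate
    return {
        "size": (size_a + size_b) * 1.2,   # grow past the simple sum: "1 + 1 = 3"
        "color": blend_color(rgb_a, rgb_b),
    }

print(merge_bubbles(1.0, 1.0, (210, 60, 40), (240, 200, 60), compatibility=0.8))
```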

Michael: I really loved how Nashida-san used organic movements to capture the “genesis” of new tastes. It was almost like a window into the mind of a chef in the kitchen, the ideas for different ingredient combinations melding and synergizing into creative sparks. The emotional pull was definitely there, and that’s exactly what we were looking for.

Nashida: In addition to capturing the creative process through the visual content, I also focused on making the operations as intuitive as possible for chefs. When it came to actually designing the controls, though, Michael and I had different ideas. He wanted the prototype to have experimental, future-oriented interactions, but I was wary about possibly going too far; I wanted to make sure the chefs knew that they’d be getting something surefire, something practical, so I proposed a tablet interface. Chefs wouldn’t have problems getting their hands on tablets or mastering the basic operations, after all. I also wanted users to start touching the interface as soon as they had it in their hands. That’s why I designed the ingredients to sway slightly on screen. When you see that movement, you know you’re supposed to use your fingers.

I also put a lot of effort into the audio interface because we wanted the app to be able to “dialogue” with the chef and the rest of the kitchen staff as a fellow team member. Nobody likes interacting with someone who can only give mechanical responses, so I focused on making things natural and conversational. When the app suggests a pairing, it might say, “We don’t have this ingredient around here, but we could use something else sourced locally.” It all plays into the interactivity element, providing a helpful, responsive sounding board to help chefs unleash their creativity.
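(As a tiny, hypothetical sketch of that kind of response logic, not the app’s actual dialogue system: the reply can change depending on whether the suggested ingredient is in local stock, falling back to a locally sourced substitute when it isn’t. The stock list and substitution table below are placeholders.)

```python
# Illustrative only: phrase a pairing suggestion conversationally, offering a
# locally sourced substitute when the suggested ingredient isn't on hand.

LOCAL_STOCK = {"cheese", "strawberry", "yuzu"}       # what the kitchen has today
SUBSTITUTES = {"chocolate": ["strawberry", "yuzu"]}  # made-up fallback options

def phrase_suggestion(target: str, suggestion: str) -> str:
    """Turn a raw pairing suggestion into a natural, team-member-style reply."""
    if suggestion in LOCAL_STOCK:
        return f"How about pairing the {target} with {suggestion}? It's already in the kitchen."
    for alternative in SUBSTITUTES.get(suggestion, []):
        if alternative in LOCAL_STOCK:
            return (f"We don't have {suggestion} around here, but we could use "
                    f"{alternative}, which is sourced locally.")
    return f"{suggestion} would pair well with the {target}, but we'd have to order it in."

print(phrase_suggestion("sea urchin", "chocolate"))
```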

Creating the future of food with chefs
from around the world

How did Sony AI’s presentation at Madrid Fusión 2020 go over? What lies in store for the Gastronomy Flagship Project?

Michael: During the presentation, I saw chefs and people from the food industry in the audience pull out their devices and start recording footage of our GUI prototype. It was clear to me that the presentation went over really well. As a first step for the app, the event was a success—but what we presented was just a prototype. We’ll have to keep the content and the design evolving so that the app can give chefs better, more concrete suggestions. Food is an incredible experience, whether you’re making it or eating it. We want to transform and amplify the value of that experience, and we’re going to keep our R&D activities on that track.

Nashida: Things are already starting to evolve, and a lot of that progress comes from the conversations we’ve had with the chefs we met at Madrid Fusión, who are giving us ideas about what they want to be able to do with AI. We’re really looking forward to that kind of collaboration. If we want the app to be a contributing member of the kitchen brigade, pitching in with advice on ingredient pairings and controlling Chef Assisting Cooking Robots, we have to embody that team spirit and incorporate input from chefs, companies, and whatever other sources we can find. That’ll help us keep improving the design as AI technology evolves.

Creating dazzling new tastes isn’t the be-all, end-all, either. We want to use our resources to make an impact in other areas, too, like improving health through better dietary habits, addressing potential food shortages in the future, and finding solutions to other sustainability issues. We’re just getting started—and we can’t wait to keep shaping the future of food.

Sony AI’s recipe-creation app needed a tantalizing GUI, one that would whet users’ creative appetites
for innovative ingredient pairings. As the Gastronomy Flagship Project continues to explore
the possibilities of AI-driven cuisine, we at the Creative Center are excited to be part of the effort
to create a new future for food.