Can AI Ever Crack Taste?

Photo: Shahram Saadat

This article is part of the Future of AI, a collection of articles that investigates how artificial intelligence will impact the fashion and beauty industries in the years to come.

Earlier this year, tech moguls and CEOs began rhapsodizing about the importance of taste.

Y Combinator co-founder Paul Graham predicted in February: “In the AI age, taste will become even more important. When anyone can make anything, the big differentiator is what you choose to make.” The same month, OpenAI president Greg Brockman declared that “taste is a new core skill”.

In fashion (and a plethora of other industries, creative or otherwise) the notion that taste has not always been imperative is laughable. The concept of taste, though, has been amplified and contorted in the age of generative AI, with the much-interrogated and debated notion of aesthetic judgement fast emerging as a 2026 buzzword. “Every company right now wants to talk about taste. Every thought leader in tech wants to write a Substack about taste,” says Andy McCune, founder of Cosmos, a visual inspiration platform.

It’s a thought-leadership exercise in showing that AI executives are not detached from the primarily human asset of good taste. Yet, if taste and personal style are innately human, cultivated through engagements with various cultural outputs — books, film, people on the street — could AI ever understand a user’s personal style, or cultivate its own sense of taste? It’s a pressing question for the fashion industry, where these very instincts are at play at every level, from clothing design to outfit and product recommendations (many of which are already coming from AI platforms and tools).

Some working in tech are entirely confident that it can be done. “I hate to break this to everyone, but you probably don’t have better taste than the AI,” one head of product wrote on X. “There’s a good chance AI will have better ideas than us within a few years,” an AI CEO quipped.

Those not siloed to the world of tech, however, are less convinced. “Taste and personal style is something you develop with time and with real-life experience,” says trend forecaster Mandy Lee. “Having no touchpoints to the real world is the antithesis of building personal taste. So whatever they’re talking about is not the same thing as taste and style.”

Shoppers themselves are unconvinced, too. At present, only 3% of shoppers surveyed by Vogue Business say they use AI chatbots to find fashion and style inspiration — versus 57% for magazines (both print and digital), followed by street style (47%), fashion blogs, Substack or Pinterest (36%), and influencers (35%).

Personal style has long been used to signal certain types of aspiration, to cultivate an individual personality that dictates how one fits into society, says Richard Thompson Ford, professor at Stanford Law School and author of Dress Codes: How the Laws of Fashion Made History. People do so by borrowing and combining references from all different aspects of life, from other communities to historical periods or social classes, using familiar images as reference points, whether it’s art and film or celebrities and influencers. Good taste isn’t merely copying, Thompson Ford says, but rather, “quoting small parts of a familiar ensemble and putting them together with other things in order to express something that, at least to them, is unique and individual”.

People’s increasing inclination to turn to AI for discovery could well rework the way in which they develop their own sense of style. Fashion-tech startups are bullish on AI tech’s ability to cut out the mundane and remove the friction. AI shopping platform Daydream wants to do just this. Its usership isn’t “fashion with a capital F”, says Lisa Yamner, co-founder and chief brands officer. “The people who are finding us are quite needs-based; it’s more fashion enthusiast than ‘show me Loewe’s recent runway’ kind of stuff.”


With almost half of survey respondents saying that their biggest challenges when it comes to styling themselves are putting outfits together from pieces they already own, and finding styles within their budget (both 45%), it’s no wonder over a third of users (36%) would consider using an AI tool to find out next season’s trends. But can AI crack taste and style to the point at which its curatorial abilities match those of a human?

Industry veterans are dubious. Lee, who has worked in trend forecasting and analysis for 10 years, says her work would suffer if she began using AI, because it lacks understanding of the cultural events and influences that shape the way we want to dress. “There’s no way at this point with technology that AI can really dig in and make sense of the cultural impacts that events, socioeconomic standings, finances, and world politics have on trends and fashion,” she says. “And when you strip away the aesthetics, that is what actually dictates trends. It’s not fashion, it’s everything around fashion.”

Can AI ever truly grasp these real-world phenomena?

Input limitations

AI’s biggest obstacle to offering up tasteful outputs is its inputs. AI relies on data sets, and the data that generic AI engines are scraping is vast (think, the whole internet) and therefore noisy. Those that are purpose-built for fashion still face an uphill battle, because fashion-related data sets don’t tend to be up to par, says Yilu Zhou, associate professor and area chair at Fordham’s business school, who has been working at the intersection of fashion and AI since 2013.

Zhou’s early research made clear that fashion taxonomy is not well standardized. “Every designer speaks a different language. They can have two very similar designs, but they use totally different words to describe them — on purpose,” she says. The branding exercise embedded in product descriptions — think Haider Ackermann’s Tom Ford “glass effect” descriptor for a now sold-out $10,250 blazer made of clear plastic — is one of the biggest obstacles to AI successfully predicting and distilling fashion trends, Zhou says. Fashion companies standardizing their data is the first step toward “useful AI”, she says. “Otherwise, your AI will be [based] on biased data, so whatever the AI generates will be biased and will not make sense because your data is not right.”

The data upon which AI bases its outputs can be misleading, experts say — especially in fashion. “It can be something that’s quite sensational, that gets a lot of social media shares but doesn’t necessarily mean people are going to go out and buy it,” says Francesca Muston, chief forecasting officer at WGSN. Think 2023’s viral visible panties on the runway, or bras in 2025, both of which buyers (correctly) expected to generate more clicks than sales. It doesn’t typically account for seasonality either, unless programmed to do so, Zhou flags. “If a data analyst doesn’t understand fashion, he will give you the wrong interpretation: that the trend is gone,” she says. “But the trend is not gone; it’s coming back next year.”

This is why human judgement is necessary: to make sense of the stories AI often obscures. “Humans are able to better contextualize disparate pieces of information, make those connections, and understand that it’s setting up an opportunity for a trend further down the line,” Muston says. Zhou agrees, adding that human expertise is needed to discern when AI is either off the mark or hallucinating in order to complete a picture derived from an incomplete data set.


Daydream’s model is purpose-built for fashion, which its founders say gives it a leg up on other generative AI tools.

Photo: Daydream

Purpose-built models strive to offer a better starting point, in order to be able to offer up products that align with whatever a user indicates their style to be. Daydream’s Yamner recalls conversations with users when she’s introducing them to the platform. “They’ll say, ‘Oh my god, I just tried that search on ChatGPT and I got horrible results,’” she says. AI that is trained on fashion should be able to offer up better suggestions.

“Daydream uses a proprietary brand mapping system to understand how brands relate to one another across style, aesthetic, and positioning,” Yamner explains. “Layered with personalized user signals, this allows us to recommend brands that feel both relevant and surprising — with no advertising or paid placement influencing what gets surfaced.”

Programming taste

Even if AI were to be prepped with the ‘right’ data, not everyone is convinced its output can match up to the taste levels achievable by humans. “If AI has the right training set, it could approximate in some circumstances what individuals do... but I think it’s always going to be a beat behind because as an individual in the world, your influences might come from the street, they might come from chance encounters, from a whole variety of sources — some of which are digitized and available to AI, and some of which aren’t,” says Thompson Ford. “I doubt that all of the influences that affect [taste], particularly for a stylish person who has a good aesthetic sense, are immediately available to an AI.”

AI also tends to pick up on generic trends: those dominating social feeds or shoppable magazine headlines. But what’s more interesting are the styles and aesthetics that develop at the local level, which are harder for AI models to pick up, Zhou says.

Programming in an attempt to match a user’s specific style can also risk getting overly granular and prescriptive, limiting room for discovery outside the aesthetics one would usually gravitate toward. “If you don’t have a thousand dollars to spend on a bag, we’re never going to show you the thousand-dollar bag,” Yamner says. But how many fashion kids have been influenced and shaped by Nicolas Ghesquière’s Balenciaga City bag of the early 2000s, or Phoebe Philo’s Celine, even if they couldn’t afford it? That still factors in, Yamner says. Though Daydream filters by price when it comes to purchase intent, the platform can use Philo’s aesthetic “as a signal to surface pieces that capture a similar vibe at a more accessible price point”, Yamner says, noting that this type of democratization is important to Daydream.


Cosmos suggests imagery that it thinks will align with a user’s taste level, determined by its “aesthetic prediction model”.

Photo: Courtesy of Cosmos

Similarly, programming solely based on fashion brands and trends is limiting, given how much personal style and taste is shaped by other areas of culture. WGSN’s trend predictions, for instance, have been sharper since the company started tracking other industries like food and sports, Muston says. “People don’t just wear clothes. They also eat food, live in a house, wear cosmetics, and engage with the other sources that we look at, like consumer tech or sports and outdoors,” she says. If all you focus on is the product itself, “you’ve missed a lot of the experience and what’s really driving that trend”.

There are founders, though, who believe that AI models can be developed in a way that enables the tech to aggregate and identify ‘good taste’ — as is dictated by the human input.

Cosmos’s McCune believes that, with the right programming, AI can learn taste. McCune’s goal for Cosmos is to be the ‘anti-slop platform’. “There’s a way to use AI to support creatives in places like search and recommendation,” he says. Cosmos’s machine learning team has built an “aesthetic prediction model” that dictates what users see on the app. It was trained on images saved by Cosmos’s first 10,000 beta users (a combination of designers, creative directors, interior designers, architects, and graphic designers), alongside “some really bad” data sets used as a negative sample. Now, every image uploaded to Cosmos is scored against the aesthetic integrity determined by the positive and negative samples.

“We set a bottom threshold and anything that is under that threshold gets deprioritized within search and for you,” McCune says. He emphasizes that it’s not a homogenization of visual culture, as Cosmos doesn’t only surface content in the top end. “We’re more using it for a bottom threshold: to get the garbage and the slop out.” Machine learning, he says, has been a key piece of Cosmos’s curation. It is, on a small scale, an answer to Zhou’s gripe with fashion’s ‘bad data’. (Though Cosmos isn’t fashion-specific.)
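The mechanics McCune describes can be sketched in a few lines. This is a hypothetical illustration, not Cosmos’s actual system: the threshold value, function names, and toy scores are all invented. The key design choice it mirrors is that low-scoring content is deprioritized in ranking rather than removed outright.

```python
# Hypothetical sketch of "bottom threshold" curation: each image gets
# an aesthetic score (here, a toy lookup standing in for a model trained
# on positive and negative samples); anything under the threshold is
# pushed to the bottom of search results rather than deleted.

THRESHOLD = 0.35  # assumed cutoff below which content is deprioritized

def rank_results(images, score_fn, threshold=THRESHOLD):
    """Order results: above-threshold images first, then the rest,
    each tier sorted by aesthetic score (highest first)."""
    kept = [img for img in images if score_fn(img) >= threshold]
    deprioritized = [img for img in images if score_fn(img) < threshold]
    kept.sort(key=score_fn, reverse=True)
    deprioritized.sort(key=score_fn, reverse=True)
    return kept + deprioritized

# Toy scores standing in for a real aesthetic prediction model
scores = {"editorial.jpg": 0.9, "street.jpg": 0.6, "slop.jpg": 0.1}
ranked = rank_results(list(scores), scores.get)
print(ranked)  # "slop.jpg" lands last, but is still reachable
```

Keeping deprioritized content reachable, rather than filtering it out, is what lets McCune argue this isn’t a homogenization of visual culture.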

Receipt-sharing app Selleb is similarly bullish on the potential of merging AI tech with human guidance. Co-founders Chloe and Claire Lee think of AI as a first layer of utility that will ultimately be vetted by humans. Users share receipts, not just for fashion (as the founders initially expected), but for cafés, modes of transport, flight tickets, groceries, and more. “Our big-picture vision is really to be able to map all of these different products on the internet, map everyone’s taste across a bunch of variables that get closer to that aspect of taste — which I still think is a very ineffable thing you can’t quite pinpoint,” says Claire.


Selleb is built on the importance of cross-category inputs to better identify someone’s taste, preferences and style.

Photo: Courtesy of Selleb

A new user connects their email and submits thousands of receipts. “Those receipts — when they happened, what categories, how much I paid — say a lot about who I am as a shopper and my DNA,” Chloe says. Users follow what the sisters call their “taste doppelgangers”: people with similar preferences across categories, based on the “taste graph” the app is essentially mapping. By looking at users’ receipts across the board, the AI on the backend is able to identify patterns it otherwise wouldn’t be able to based on data points universally available online — and surface fashion (and other) recommendations accordingly.
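One plausible reading of the “taste graph” idea is a similarity search over per-category spending vectors built from receipts. The sketch below is an assumption about how such matching could work, not Selleb’s implementation; the categories, figures, and function names are invented.

```python
import math

# Hypothetical "taste doppelganger" matching: represent each user as a
# vector of spend per category (derived from receipts) and find the
# most similar other user by cosine similarity.

CATEGORIES = ["fashion", "cafes", "travel", "groceries"]

def cosine(a, b):
    """Cosine similarity between two equal-length spend vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def doppelganger(my_vector, others):
    """Return the name of the user whose spending profile is closest."""
    return max(others, key=lambda name: cosine(my_vector, others[name]))

users = {
    "alex": [300, 80, 120, 200],  # heavy on fashion
    "sam":  [20, 150, 40, 400],   # mostly cafés and groceries
}
me = [280, 60, 100, 180]
print(doppelganger(me, users))  # proportions match "alex" most closely
```

Cosine similarity compares the *shape* of spending rather than its total, which is one way cross-category patterns could surface matches that single-category data (fashion purchases alone) would miss.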

Looking back

AI predicts and identifies based on past data points. This means that, no matter how much the technology evolves, it does not have the capacity to look or think beyond these inputs. “AI isn’t great on novelty — and so much of trends is dependent on novelty,” WGSN’s Muston says.

In reality, people’s style and taste preferences change and evolve based on shifting contexts, when there are movements in culture that AI has no way of anticipating. “Trends are highly complex and they move in all sorts of different ways,” Muston says. “You can’t count the number of times where people have said to you, ‘Oh, I would never wear XYZ,’ and are vehemently opposed to that trend because of some historical association that they’ve had with it. Yet, when you see it coming down the line, suddenly reframed in a new context, it then becomes desirable.” If a look or an aesthetic seems doomed to fail based on past data, AI will take that at face value. Humans, on the other hand, can interrogate the context in which it’s happening — and why a comeback might be on the cards.

People’s interest in certain brands or aesthetics is often piqued by completely random or statistically improbable events that AI is not able to account for or consider, says Madé Lapuerta, who runs @DataButMakeItFashion. She points to a surge in interest in Van Cleef & Arpels last November, when Dodgers player Miguel Rojas — who wasn’t even meant to be up to bat — hit a game-changing home run, winning the World Series. “Because AI-driven predictive models are completely dependent on patterns we’ve observed in the world before, they can’t look into the future or understand what will resonate and what won’t.”

This is Lee’s biggest gripe with the use of AI in work that involves predicting trends or anticipating how tastes will develop and change. “The way that AI quote-unquote predicts trends is not a prediction. It’s what’s currently happening now,” she says.


A still from Cosmos’s recent short film, which McCune says illustrates the platform’s taste level, starring Odessa A’zion and directed by Aidan Cullen.

Photo: Daniel Vignal

The human edge

This human edge is key. AI’s reliance on historical data illustrates how AI can surface content based on the ‘what’, but not the ‘why’. “AI can codify taste, but in a synthetic way,” Muston says.

It’s too imitative, Thompson Ford agrees. “It’s one thing to say, ‘Hey, I want to look like Ralph Lauren’s collection from last year,’ and AI might be able to do that. But if I want to look like someone’s collection this year that they haven’t [created] yet, I’m skeptical that AI is going to be able to do the same thing that a designer does — or the same thing that a stylish individual does.”

Even techno-optimists like McCune are questioning such ideas. “The inherent nature of models is that you have to train [the models] on something — so it has to come from something in the past,” he says. “Whereas humans have the ability to look forward and create new things and trends and aesthetics, models will always be a reflection of the past — but I think that humans will be the only thing that are inherently able to truly look into the future.”


The only world in which experts think AI could feasibly imitate this is if it were to gain sentience — the possibility of which is a hot debate topic among AI experts — and even then, it’s not a sure thing. “I believe that [generative] AI will be able to cultivate taste and style, but I think it will be a taste and a style of the now or of the past,” McCune says. “It’s not going to be able to look into the future and create new things that feel on-taste or on-trend.”

Lee — less of an AI optimist — agrees that AI’s lack of ability to look forward is its limiting factor. But to her, this means AI (without sentience) will never be able to cultivate taste nor style. “You have to go outside, see what people are wearing, see what people are talking about, watch movies, listen to music, pay attention to current events and politics,” Lee says. “These are actually the things that shape fashion and taste and style. It’s not the actual clothes you’re wearing — it’s everything about you as a person. If you’re relying on fucking AI robots to tell you who you are, you’re never going to have style.”

Even if AI were, one day, to gain sentience, no longer bound by human input, there’s one thing it would still lack: a human body. And at the end of the day, without a body from which to operate, and put clothing onto, cultivating a sense of taste and style is near-futile. “The one thing AI doesn’t have is a body,” Thompson Ford says. “I find it hard to imagine that the kind of intuitions that come from someone who’s thinking about how they move through the world in their own body and interact with other people, that those intuitions could come to AI — except, again, in an imitated fashion.”

Lee agrees. “I’m sure it will improve, but AI will never be a human. So it’s impossible, I think, to make real sense of events in the world and how it will translate to fashion,” she says. “I’ve been doing this for 10 years, and sometimes even I’m wrong or not up to speed on certain things. And there’s no way a robot will ever be better at it than me.”