That said, starting in the early 1990s misses the whole first half of the story, no? Searching Google Books with a 1980-1990 date range for things like "3d graphics" "art" or "3d graphics" "special effects" yields plenty of primary sources indicating that creative applications were driving demand for chips and workstations focused on graphics. For instance, this is from a trade journal for TV producers in 1987: "Perhaps the greatest dilemma facing the industrial producer today is what to do about digital graphics... because special effects, 2d painting, and 3d animation all rely on basically the same kind of hardware, it should be possible to design a 'graphics computer' that can handle several different kinds of functions." [https://www.google.com/books/edition/E_ITV/0JRYAAAAYAAJ?hl=e...]
It's not hard to find more examples like this from the 1985-1989 period.
Of course graphics hardware was also used for more creative purposes, including desktop publishing, special effects for TV, and digital art, so you will find some people in those communities vaguely wishing for something better. But artistic creation, even for commercial purposes, was never the market driver of 3D acceleration. Games were. The hardware was designed for gamers first, game programmers second, game artists a distant third, and for nobody else.
The closest thing to an "art computer" around that time was the Amiga, which targeted the design/audio/video production markets.
Edit: this discussion is interesting because I have always just taken it for granted that video games are a form of art. Clearly others don't see it that way, which is fair! Nevertheless, I think a strong case can be made: https://en.wikipedia.org/wiki/Video_games_as_an_art_form
Gamers weren't primarily spending time or money for the art, and neither was NVIDIA. I will grant that the hardware improvements did make the visuals more lifelike and detailed, which allowed for increased artistic range, but production costs generally increased accordingly.