Microsoft technology chief Kevin Scott said on Wednesday that the company is having an easier time getting access to Nvidia's chips that run artificial intelligence workloads than it was a few months ago.

Speaking on stage at the Code Conference in Dana Point, California, Scott said the market for Nvidia's graphics processing units is opening up a little. The GPUs have been heavily in demand since Microsoft-backed OpenAI launched the ChatGPT chatbot late last year.

"Demand was far exceeding the supply of GPU capacity that the whole ecosystem could produce," Scott told The Verge's Nilay Patel. "That is resolving. It's still tight, but it's getting better every week, and we've got more good news ahead of us than bad on that front, which is great."

Microsoft, like other tech companies, has been quickly adding generative AI to its own products and selling the technology's capabilities to clients. Training and deploying the underlying AI models has relied mainly on Nvidia's GPUs, creating a supply shortage.

Nvidia said it expects revenue growth this quarter of 170% from a year earlier. The company has such control of the AI chip market that its gross margin shot up from 44% to 70% in a year. Nvidia's stock is up 190% in 2023, far outpacing every other member of the S&P 500.

In an interview with Patel that was published in May, Scott said one of his responsibilities is controlling the GPU budget across Microsoft. He called it "a terrible job" that has been "miserable for five years now."

"It's easier now than when we talked last time," Scott said Wednesday. At that time, generative AI technologies were still new and attracting broad attention from the public, he said.

The increased supply "makes my job of adjudicating these very gnarly conflicts less terrible," he said.

Nvidia expects to increase supply each quarter through next year, finance chief Colette Kress told analysts on last month's earnings call.

Traffic to ChatGPT has declined month over month for three consecutive months, Similarweb said in a report. Microsoft provides Azure cloud-computing services to OpenAI. Meanwhile, Microsoft is planning to start selling access to its Microsoft 365 Copilot to large organizations with subscriptions to its productivity software.

Scott declined to address the accuracy of reports regarding Microsoft's development of a custom AI chip, but he did highlight the company's in-house silicon work. Microsoft has previously worked with Qualcomm on an Arm-based chip for Surface PCs.

"I'm not confirming anything, but I will say that we've got a pretty substantial silicon investment that we've had for years," Scott said. "And the thing that we will do is we'll make sure that we're making the best choices for how we build these systems, using whatever options we have available. And the best option that's been available during the last handful of years has been Nvidia."
