Nvidia is estimated to be earning a substantial margin, potentially around 823% over cost, on each H100 GPU accelerator it sells. The per-chip cost of an H100 is estimated at roughly $3,320, covering components and manufacturing. Nvidia's street price of $25,000 to $30,000 for each HPC accelerator more than covers that outlay. However, the sticker price also has to absorb product development and R&D, which demand significant resources and a team of specialized engineers. Nvidia's average salary for an electronics hardware engineer is around $202,000 per year, and developing a chip like the H100 likely consumes thousands of hours across many such specialists.
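As a rough sketch of the arithmetic behind these figures: the markup percentage is simply (street price − unit cost) / unit cost. The specific price point that yields the cited ~823% is not stated in the estimates, so the values below only bracket it; the cost and price numbers are the article's estimates, not confirmed figures.

```python
COST = 3_320  # estimated per-unit component + manufacturing cost (USD), per the article

def markup_pct(street_price: float, cost: float = COST) -> float:
    """Profit as a percentage of cost: (price - cost) / cost * 100."""
    return (street_price - cost) / cost * 100

for price in (25_000, 30_000):
    print(f"${price:,}: ~{markup_pct(price):.0f}% over cost")
# → $25,000: ~653% over cost
# → $30,000: ~804% over cost
```

Note that this excludes R&D, software, and support costs, which is why a components-only markup overstates the true profit margin.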

Despite the substantial expenses involved, Nvidia's position as the leader in AI acceleration workloads has kept demand for its GPUs high, with orders reportedly sold out until 2024. The AI accelerator market's expected growth to $150 billion by 2027 further points to a promising future for Nvidia. This rapid growth in AI servers has, however, hit DDR5 manufacturers, forcing them to revise their market-penetration expectations: DDR5 adoption is now expected to reach parity with DDR4 only by 3Q 2024.

While Nvidia is reaping the benefits of its investments in AI infrastructure and its product stack, there is an opportunity cost for other players in the industry. The high price of AI acceleration could crowd out investment in other areas or constrain riskier research and development ventures.

Nvidia's strong bottom line from AI chip sales reflects the company's success in this market and leaves room for substantial further growth. CEO Jensen Huang likely sees this as a positive development for the company.