NVIDIA CEO Jensen Huang Leaves Everyone SPEECHLESS (Supercut)

Ticker Symbol: YOU
16 minute read

Computing costs have fallen drastically over the past decade, and Nvidia aims to drive the marginal cost of computing to nearly zero so that large language models can access all digital human knowledge. The company continues to advance its GPU chip technology, with plans to increase deep learning capability by a further million times in the next decade.

Insights

  • Computing and deep learning costs have fallen drastically in the last decade, with the aim of bringing the marginal cost of computing close to zero so that large language models can access the vast store of digital human knowledge.
  • GPU systems like the H100, with more advanced versions planned, are revolutionizing artificial intelligence by condensing a data center's capabilities into a single computer, paving the way for a further million-fold increase in deep learning computational power over the next ten years.


Recent questions

  • How have computing costs evolved over the past decade?

    Computing costs have decreased by 1 million times in the last 10 years.

  • What is the goal of reducing the marginal cost of computing?

    The goal is to enable large language models to extract all digital human knowledge from the internet.

  • What is the significance of the H100 chip in artificial intelligence?

    The H100 condenses a data center of old CPUs into a single computer, and Nvidia aims to increase deep learning computational capability by a further million times.

  • How does continuous learning impact the training and inference processes?

    Continuous learning merges training and inference processes into a seamless loop.

  • What is Nvidia's dominance in the inference market attributed to?

    Nvidia's accelerated computing platform is widely compatible and runs everywhere.
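
The million-fold cost reduction cited above can be sanity-checked as compound growth; the short calculation below is illustrative arithmetic, not a figure from the talk:

```python
# A 1,000,000x improvement over 10 years implies this annual growth factor:
factor = 1_000_000 ** (1 / 10)
print(f"Implied improvement per year: {factor:.2f}x")  # roughly 3.98x
```

In other words, a million-fold gain in a decade corresponds to hardware and software together improving by about 4x every year, far faster than the roughly 2x every two years of classic Moore's law scaling.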

Summary

00:00

"Computing Costs Reduced, AI Advancements Accelerate"

  • In the last 10 years, computing costs have been reduced by 1 million times, and deep learning costs have also decreased by 1 million times.
  • The goal is to reduce the marginal cost of computing to nearly zero to enable large language models to extract all digital human knowledge from the internet.
  • This breakthrough in computing has enabled a new way of software development, allowing computers to understand the meaning of digital knowledge.
  • Gene sequencing is digitizing genes, but with large language models, the meaning of genes and cells can be understood.
  • The GPU chip behind artificial intelligence is the H100, with plans to introduce the H200 and eventually the H700 by March 2029.
  • The H100 weighs 70 lbs, consists of 35,000 parts, and condenses a data center of old CPUs into a single computer.
  • Nvidia engineers at the chip, algorithm, and data center levels, computing at data center scale, with the goal of increasing deep learning computational capability by another million times over the next 10 years.
  • Continuous learning will merge training and inference processes, creating a seamless loop of learning and application.
  • Nvidia dominates the inference market with its accelerated computing platform, which is widely compatible and runs everywhere.
  • The goal is to drive the marginal cost of computing down to zero, leading to new ways of computation and increased computational capabilities.
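
The "seamless loop" of training and inference mentioned above can be sketched as online learning, where every prediction is immediately followed by a model update. The toy one-parameter model and data stream below are hypothetical illustrations, not Nvidia's actual method:

```python
# Continuous learning sketch: inference and training interleaved in one loop.
# A one-parameter linear model y = w * x learns online via SGD.

def continuous_loop(stream, lr=0.1):
    w = 0.0
    for x, y_true in stream:
        y_pred = w * x             # inference: use the current model
        error = y_pred - y_true    # feedback from the environment
        w -= lr * error * x        # training: immediate SGD update
    return w

# Toy stream where the true relationship is y = 2 * x
stream = [((i % 3 + 1) * 0.5, (i % 3 + 1) * 1.0) for i in range(300)]
w = continuous_loop(stream)
print(round(w, 2))  # converges toward 2.0
```

The key difference from the conventional train-then-deploy pipeline is that there is no separate training phase: the same loop that serves predictions also updates the model.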

16:07

Tailored solutions for complex customer needs

  • Customized solutions are considered only for customers of a certain scale, because the platform spans multiple chips, networking components, and software, and customization requires significant R&D investment.
  • While customization is possible, it must build on existing platform elements so as not to reset and squander the substantial investment already made; Nvidia is open to developing proprietary security, confidential computing, or new processing methods for such customers.
