Jensen Huang's GTC 2026 keynote wasn't just about new chips. It showed Nvidia pushing to own the economics of inference, agentic AI, and the infrastructure beneath the next industrial wave.
The post At GTC 2026, Jensen Huang Shows How Nvidia Plans to Run the 'Full AI Stack' appeared first on eWEEK.
At GTC 2026, NVIDIA and Bolt announced what they hope will be a symbiotic partnership. Bolt gets NVIDIA technology that would be costly and impractical to build on its own. Meanwhile, NVIDIA not only gains a major customer but also access to the European rideshare company's driving data.
Bolt says its fleet data will power a "learning engine" for autonomous vehicles (AVs) built on NVIDIA technology. The rideshare company will use NVIDIA Cosmos to curate and search driving data. It will tap NVIDIA Omniverse to reconstruct digital twins from real-world driving logs, then use Cosmos again to generate and augment data at scale.
NVIDIA's Alpamayo model, designed
Just months after announcing DLSS 4.5 at CES, NVIDIA has unveiled its next major upscaling technology, DLSS 5. The company is doubling down on AI for this iteration, claiming DLSS 5 "infuses pixels with photoreal lighting and materials" using a real-time neural rendering model when it arrives this fall.
So what does this mean in practice? In an on-stage demo at NVIDIA's GTC 2026 keynote, CEO Jensen Huang showed off the technology with Resident Evil: Requiem, Hogwarts Legacy and Starfield. DLSS 5 added a noticeable amount of detail to characters' hair and skin tones, but the comparison appeared to be against those games running without any DLSS features enabled. It's unclear how much of a difference it makes compared to DLSS 4.5 with path tracing and all of its features turned on.
"DLSS 5 takes a game's color and motion vectors for each frame as input, and uses an AI model to infuse the scene with photoreal lighting and materials that are anchored to source 3D content and consistent from frame to frame," NVIDIA said in a blog post. The company also notes that the technology runs in real time, and it works at up to 4K.
Huang showed off DLSS 5 while running a system with two RTX 5090 GPUs. Eventually, it will be able to run on a single video card (though I'd imagine it would have to be nearly as powerful as two 5090s). Huang also painted DLSS 5 as a step toward Hollywood-like quality for real-time rendering, without the GPU horsepower required by studios. It sounds a bit like a generative AI video model that can be directly controlled by developers, instead