NVIDIA RTX 6000 Ada Lovelace announced for creative designers

The company promises two to four times better performance than the previous generation

This week, in addition to introducing its new gaming-focused GPUs – the GeForce RTX 40 series – NVIDIA also announced the NVIDIA RTX 6000, aimed at workstations and based on the company's new Ada Lovelace architecture. According to the brand, the new model can deliver two to four times the performance of the previous generation, the RTX A6000.

The board features 48GB of GDDR6 GPU memory with ECC, along with third-generation RT Cores – capable of running ray tracing and shading workloads simultaneously, for example – fourth-generation Tensor Cores, and CUDA Cores with up to twice the throughput of the previous generation. The RTX 6000 is also compatible with NVIDIA virtual GPU (vGPU) software.

Other specifications include four DisplayPort 1.4 outputs, a maximum power consumption of 300W, and a PCI Express 4.0 x16 interface.

A new generation in the professional sector

With the RTX 6000 GPU, NVIDIA also updates its lineup of boards serving the professional workstation segment. The company notes that the new Ada Lovelace architecture offers AI and shader capabilities suited to designers and engineers who need real-time processing, as well as to scientists and researchers who rely on powerful computers to develop drugs or medical procedures, for example.

“NVIDIA’s professional GPUs helped us deliver an unparalleled experience for baseball fans everywhere, bringing baseball legends to life with AI-powered facial animation,” says Michael Davis, Senior Vice President of Field Operations at Fox Sports.

NVIDIA says the RTX 6000 will be available from global distribution partners and OEMs starting in December.


Source: NVIDIA
