My take: tremendous card, but the fact that in AC Valhalla it ties the 3090 Ti at 1080p and only gains about 20 fps at 2K... I predict the 4080s are going to have a rough time...
Because they're aimed more at those resolutions than at 4K.
That the 4090 doubles the 3090 Ti's ray-tracing performance in Cyberpunk (30 vs 60 fps), with no DLSS or any tricks... is monstrous.
The most level-headed conclusion I've read from an outlet in a long time (guru3d):
From a technological point of view, the GeForce RTX 4090 is a bit of a masterpiece and an enigma. It feels bizarre to talk about products that consume 425~450 Watts (nearly half a kilowatt-hour per hour of gaming) in times when people are concerned about heating their homes in the winter. Of course, when NVIDIA was developing this GPU, times were different as there was no war on the European border. When we look at performance per joule of energy, NVIDIA advanced bigtime though, so the ADA architecture has a lot of potential to be energy friendly. My message to NVIDIA is simple: make an energy-efficient statement, and design a product that offers excellent gaming horsepower for as little energy as needed. For those who live in different parts of the globe, here in the EU, energy prices are closing in at 50-75 cents per kWh, in some parts of the continent even 95 cents per kWh. Enough said about that, though.