NVIDIA GeForce Gamescom 2018 keynote live blog

At the Gamescom 2018 event, on August 20 at 12:10, NVIDIA held the presentation of the GeForce RTX series of graphics cards. This is the live blog from Anandtech.com. Yes, it's in English.
Several previews had already been shown since March.
The timestamped entries run from most recent to oldest. In other words, read it from the bottom up.
At the end of the text are the images; sorry, but I wasn't going to place them all where they appear in the original. The images are in the chronological order of the presentation.

Corrected prices, Founders and Reference editions
•GeForce RTX 2080 Ti Founders Edition: $1199
•GeForce RTX 2080 Ti Reference: $999
•GeForce RTX 2080 Founders Edition: $799
•GeForce RTX 2080 Reference: $699
•GeForce RTX 2070 Founders Edition: $599
•GeForce RTX 2070 Reference: $499




****************LIVE BLOG***********************************************
02:08PM EDT – That’s a wrap. Thanks for joining us, everyone!
02:07PM EDT – But we still have no idea how it specifically performs in numbers…
02:06PM EDT – And that’s the end of that!
02:06PM EDT – And a little demo combining the dancing astronaut with the Quadro room scene
02:05PM EDT – Oh, they’re showing the astronaut demo again
02:04PM EDT – ‘Everything you see here is completely in real-time’
02:04PM EDT – And one more surprise. A real-time demo, sounds like
02:03PM EDT – Advancements in the RTX platform, their measurement metric with RTX OPS, and the 20-series that is launching a month from now
02:03PM EDT – Summarizing the announcements, now
02:02PM EDT – 2080 Ti is $1199 on the NVIDIA website, as well.
02:01PM EDT – A graph for RTX OPS. Not a great graph, though, because those OPS are very specific
02:01PM EDT – ‘The RTX 2070 has higher performance than the Titan Xp’
02:00PM EDT – However*, on their website, the Founders Editions are $100 more
02:00PM EDT – And starting at $499 for the 2070
02:00PM EDT – Starting at $699 for the 2080
02:00PM EDT – Correction on the pricing. Starting at $999 for 2080 Ti
01:59PM EDT – Preorders available now
01:59PM EDT – On shelves everywhere September 20
01:59PM EDT – Oh, sorry, they mean the 2080 Ti
01:58PM EDT – The RTX 2070 at $499
01:58PM EDT – For the RTX 2070, 6 gigarays/s, 8 GB RAM, and totalling 45 tera RTX OPS
01:57PM EDT – Three models: RTX 2070, 2080, and 2080 Ti
01:57PM EDT – ‘And it’s just so quiet’ … 1/5 audio levels of 1080 Ti at max overclock
01:57PM EDT – ‘Designed for crazy amounts of overclocking’
01:56PM EDT – Dual-fan design for reference/Founders Edition
01:55PM EDT – Answer: a new GeForce
01:55PM EDT – ‘I guess your question is this: what are you gonna run it on?’
01:54PM EDT – ‘There are so many other RTX games coming your way’
01:54PM EDT – Another surprise? BF V open beta on September 6
01:53PM EDT – Now a trailer
01:53PM EDT – ‘You guys do destruction great! There’s a lot of pent-up frustration there’
01:52PM EDT – A whoop rings through the venue as the V1 missile explosion gets reflected on the crumbling building windows
01:52PM EDT – The old school cube maps are also static. But with ray tracing on, you can see destruction in the reflections.
01:50PM EDT – Ray tracing works with far away objects, too. Like a burning plane on the hood of a car
01:49PM EDT – Or in the windows of the buildings on the side, or on the wood finish of a gun
01:48PM EDT – With RTX on, looking at puddles on the ground will reflect the flamethrower spitting out overhead
01:48PM EDT – Another shot of off-screen explosions being visible on a car
01:47PM EDT – Basically doing what screen-space reflection can’t
01:46PM EDT – Working with NVIDIA for the past year, and with RTX allowing reflections in the irises of off-screen fire
01:45PM EDT – About to show an RTX alpha demo for BF V
01:45PM EDT – Speaking of mayhem and chaos rendered in BF
01:44PM EDT – ‘You guys know how to blow it up’
01:43PM EDT – Bringing up Jonas and Christian from DICE
01:43PM EDT – And now for Battlefield V
01:43PM EDT – A little robot called Belyash? But showing off curved mirror reflections
01:42PM EDT – That’s the mantra of today, really. ‘Raytracing just works’
01:42PM EDT – ‘Everything just works’
01:42PM EDT – It’s a game in development, by the way
01:41PM EDT – ‘Ray traced reflections and inter-reflections’
01:41PM EDT – ‘Everything is gonna be completely dynamic’
01:40PM EDT – Now, a racing simulation
01:40PM EDT – One of the points brought up with global illumination is keeping realistic tension in games where a creepy crawlie might be hiding in the corner. Which works well with games like Metro
01:38PM EDT – Another exclusive trailer for Metro Exodus, now
01:38PM EDT – With RTX on, the dynamic lighting is much more true-to-life. I can’t help but think again about HDR because of the touted improvements in shadow/brightness contrasts
01:37PM EDT – With a fake ‘global illumination’, there’s constant intensity even in areas of the room that should be darker
01:36PM EDT – Modelling this is very computationally expensive. Otherwise, you could use fake lights, but it wouldn’t be convincing
01:36PM EDT – With GI on, the rest of the room is lit in accordance with how the sunlight bounces off, refracts, and affects ambient indoor lighting
01:35PM EDT – Another 10 year pursuit to make this possible
01:34PM EDT – Basically modelling non-direct lighting effects. For this window scene, all the bits not under direct sunlight
01:34PM EDT – This time, highlighting global illumination with RTX
01:33PM EDT – Next is: Metro Exodus
01:32PM EDT – (The trailer is rendered on GeForce RTX, by the way)
01:31PM EDT – And now, an exclusive trailer for Shadow of the Tomb Raider
01:31PM EDT – ‘Contact hardening just works, penumbra works, everything just works’
01:30PM EDT – Sharp unrealistic shadows vs realistic soft shadows
01:29PM EDT – The difference in lighting and shadows is very noticeable. ‘This is state-of-the-art for non-RTX realtime shadows’
01:28PM EDT – ‘The beautiful thing about ray tracing is you turn it on, like a light’
01:27PM EDT – Working with Eidos Montreal to bring real time ray tracing to Tomb Raider
01:26PM EDT – Namely, allowing more realistic soft shadows
01:26PM EDT – Showing what it looks like with RTX on, and how it improves shadows
01:25PM EDT – But now, a Shadow of the Tomb Raider demo
01:25PM EDT – How lucky!
01:24PM EDT – Praising the devs. JHH says every devtech in the company will get a new Turing
01:24PM EDT – (What we’ve all been waiting for)
01:23PM EDT – Now: Turing for games
01:23PM EDT – The 1080 Ti can only manage 30 ish fps. The Turing GPU is ‘twice the performance’
01:22PM EDT – One could imagine the lighting benefits in addition to HDR
01:21PM EDT – Unreal Infiltrator
01:21PM EDT – A new demo now, with 4K demonstrating DLSS
01:20PM EDT – ‘Because we can take a lower res image and train a neural network to upscale it, in real-time enhance pixels via tensor cores’
01:20PM EDT – (The hunt against jaggies continues even at 4K)
01:19PM EDT – Comparing the improvement vs TAA at 4K
01:19PM EDT – But this is what will be training DLSS
01:19PM EDT – (DGX-2 is not new, of course)
01:18PM EDT – 2 PFLOPS in the whole thing
01:18PM EDT – Now bringing out a DGX ‘tray’ of GPUs
01:17PM EDT – https://www.nvidia.com/en-us/geforce/20-series/
01:17PM EDT – All part of the RTX platform
01:17PM EDT – This would be pre-generated on supercomputers (DGX) and then used. So it sounds like a new image-quality-oriented inferencing feature set
01:16PM EDT – Takes a perfect image (ground truth) and trains to get other images as accurate as possible
01:15PM EDT – Neural Graphics Framework, basically uniting graphics and deep learning/AI
01:15PM EDT – New NVIDIA platform: NGX
01:14PM EDT – Training it to create beautiful 4K images from other inputs. Temporally stable, for real-time use
01:14PM EDT – (NVIDIA is leaking)
01:14PM EDT – Using a convolutional autoencoder to make it like an RNN (aka a neural network with memory)
01:13PM EDT – Creating 100,000s of super HQ images to train a NN
01:12PM EDT – Now talking about NVIDIA DLSS. Deep learning super sampling
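
A quick aside on the DLSS entries above. NVIDIA has not published how DLSS works internally, so the following is only a toy sketch of the general idea described on stage: a small convolutional network trained with backpropagation to turn a low-res render into a higher-res frame, against supersampled ‘ground truth’. Every layer choice and name here is my own assumption (PyTorch used purely for illustration):

import torch
import torch.nn as nn

# Toy learned 2x upscaler: the encoder compresses local context, the decoder
# upsamples and reconstructs. Purely illustrative; not DLSS's real topology.
class ToyUpscaler(nn.Module):
    def __init__(self):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        self.decode = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, x):
        return self.decode(self.encode(x))

model = ToyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step: low-res render in, "perfect" supersampled frame as target.
low_res = torch.rand(1, 3, 540, 960)          # stand-in for a 960x540 render
ground_truth = torch.rand(1, 3, 1080, 1920)   # stand-in for the 1080p target
loss = nn.functional.l1_loss(model(low_res), ground_truth)
opt.zero_grad()
loss.backward()    # the backprop/SGD correction step mentioned in the entries below
opt.step()
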
01:12PM EDT – Again citing 10x 1080 Tis for the tensor cores’ 110 TFLOPS
01:12PM EDT – Of course, all of this comes back to the Tensor Cores
01:11PM EDT – Basically teaching real-time rendering techniques on GPUs. Taking neural networks and training them with the desired output
01:09PM EDT – And here’s an example with teaching the GPU
01:09PM EDT – Taking advantage of the correction steps in DL (backpropagation, SGD)
01:09PM EDT – Or getting high resolution from a low res image
01:08PM EDT – But applicable to graphics/images, as we know. Colorization, super resolution, as some examples
01:07PM EDT – (Kind of a big deal)
01:07PM EDT – And now for deep learning
01:06PM EDT – Didn’t seem like a steady framerate at times, though
01:06PM EDT – It’s a humorous little showcase with dancing
01:05PM EDT – Putting on powered armor onto an astronaut, suit glistening from the lighting and all that
01:04PM EDT – Demo is called ‘Sol’. Like the Sun
01:03PM EDT – Running on 1 Turing GPU
01:03PM EDT – ‘Things are things that look like things’
01:03PM EDT – Beautiful everything, he says
01:03PM EDT – A new recent demo to highlight the power of mixed mode rendering, rasterization, ray-tracing, compute, AI, all in real-time
01:02PM EDT – ‘The benefit of RTX is just: turn it on’
01:02PM EDT – Now showing movement within the scene
01:01PM EDT – Basically, all the tricks used to hide inaccuracies, RTX allows you to do it right
01:00PM EDT – And with RTX on, the refractions, lighting, caustics are much better, clear
01:00PM EDT – The first scene is ‘the limits of today’s computer graphics’
12:59PM EDT – (Stream viewer count is up to just shy of 210K thus far)
12:59PM EDT – Note the new Quadro design in the corner
12:59PM EDT – Now looking at another scene
12:58PM EDT – ‘This is just the beginning, I’m just warming you up. An olive for the appetizer’
12:57PM EDT – Dull and flat vs bright and shiny
12:57PM EDT – Before-and-after with RTX
12:57PM EDT – Enabling all sorts of accelerated graphics rendering techniques
12:57PM EDT – Showing off some of the efforts by SEED, a group within EA
12:56PM EDT – Now citing raytracing research and development efforts
12:55PM EDT – To create a parallel acceleration structure
12:55PM EDT – It took 10 years to develop this
12:54PM EDT – Once a beam intersects with a bounding box, you can get rid of the other bounding boxes
12:54PM EDT – ‘Binning’ for ray tracing, as opposed to calculating every single affected triangle per ray
12:53PM EDT – Talking about BVH now
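
For anyone wondering what the BVH entries above amount to in code, here is a minimal sketch of the textbook idea, in plain Python with a node layout I made up for illustration (the RT cores implement this in fixed-function hardware, not like this): test the ray against a bounding box first, and on a miss skip every triangle inside it.

from dataclasses import dataclass

@dataclass
class BVHNode:
    box_min: tuple          # (x, y, z) corner of the bounding box
    box_max: tuple
    triangles: list = None  # set only on leaf nodes
    left: "BVHNode" = None
    right: "BVHNode" = None

def ray_hits_aabb(origin, inv_dir, box_min, box_max):
    # Slab method: the ray misses the box if the per-axis entry/exit
    # intervals never overlap. inv_dir is 1/direction, precomputed.
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

def candidate_triangles(node, origin, inv_dir):
    # The culling described above: miss a bounding box, and the whole
    # subtree (all triangles inside) is discarded without exact tests.
    if node is None or not ray_hits_aabb(origin, inv_dir, node.box_min, node.box_max):
        return []
    if node.triangles is not None:   # leaf: only these need ray/triangle tests
        return node.triangles
    return (candidate_triangles(node.left, origin, inv_dir) +
            candidate_triangles(node.right, origin, inv_dir))
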
12:53PM EDT – Highest end Titan X does 12T RTX OPS
12:52PM EDT – ’78 Tera RTX OPS’
12:52PM EDT – For the first part, 14 FP32 TFLOPS, 110 TFLOPS (10 TFLOPS per Giga Ray)
12:51PM EDT – FP and INT in the second. Then Tensor Cores in the last
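
An aside on the ‘RTX OPS’ figures above: the metric blends the throughput of the different units with duty-cycle weightings. The weights below are an assumption on my part (they match the breakdown NVIDIA circulated after the show, but take them as illustrative rather than something stated in the keynote); the arithmetic does land on the quoted 78.

# Reconstructing the headline "78 Tera RTX OPS" from the parts listed above.
# The 80/28/20/40% duty-cycle weights are an assumption, not keynote material.
fp32_tflops   = 14.0        # shader FP32 throughput
int32_tips    = 14.0        # the concurrent integer pipe
tensor_tflops = 110.0       # FP16 tensor core throughput
rt_equiv      = 10 * 10.0   # 10 gigarays/s at "10 TFLOPS per gigaray"

rtx_ops = (fp32_tflops * 0.80 + int32_tips * 0.28 +
           tensor_tflops * 0.20 + rt_equiv * 0.40)
print(rtx_ops)              # 77.12, i.e. roughly the quoted 78
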
12:51PM EDT – The shader and RT are concurrently running in the first part
12:51PM EDT – Generating new info that looks right, generate missing pixels
12:50PM EDT – When everything is done, we can use AI to generate resolution that we otherwise couldn’t have
12:50PM EDT – Without it, the RT step would be 10X larger
12:49PM EDT – ‘We’re ray tracing all the time. That’s the fraction of the time the RT core is doing it’
12:49PM EDT – Now showing the workload inside Turing
12:49PM EDT – So the pipeline has to be different, the way of measurement has to be different
12:48PM EDT – ‘Back in the good old days, all the computation was done in shading. In the future, we’re going to be doing lighting, image processing’
12:48PM EDT – Pascal is all the way up there at 308ms. Not sure if they mean just one, though
12:47PM EDT – ‘This has simply never happened before, a supercomputer replaced by 1 GPU in 1 generation’
12:47PM EDT – 1x Turing is 45ms
12:47PM EDT – For performance, the DGX with 4 Voltas on the Star Wars demo, it is 55ms (~20 fps)
12:46PM EDT – All of this, of course, we’ve covered earlier, by the way
12:46PM EDT – 1080 Ti does 1.21 gigarays/s, Turing does 10 gigarays/s
12:45PM EDT – And the RT Core, 10 gigarays per second
12:45PM EDT – variable rate shading for things like foveated rendering
12:44PM EDT – Concurrent FP and INT execution: FP for colors, INT for addresses, for example
12:44PM EDT – SM is completely brand new
12:44PM EDT – This chip is the second largest chip that the world has ever made
12:43PM EDT – ‘The most advanced GPU we’ve ever done, the greatest leap since we created CUDA’
12:43PM EDT – But it was running on Turing, which we’ve been working on for almost 10 years
12:43PM EDT – ‘Good luck with that’
12:42PM EDT – So DGX for 3,000 payments of $19.95, of course
12:42PM EDT – ‘Where do we buy the DGX? How do we turn it into a games console?’ JHH on customer requests
12:41PM EDT – Which took 4x V100s in a $68,000 DGX, 20 ish fps for real time ray tracing
12:41PM EDT – Not practical for a scene of that level of fidelity
12:41PM EDT – You could either fake the shadows or use reflection probes/cube maps for every single scene
12:40PM EDT – Using area lights makes it hard to render shadows
12:39PM EDT – Real-time ray tracing*
12:39PM EDT – And here’s NVIDIA’s raytracing demo from GTC
12:38PM EDT – And worked with Epic to implement it into Unreal
12:38PM EDT – We worked with Microsoft to create DXR and RTX
12:37PM EDT – Rasterization: 3D world into 2D data. Ray-tracing: sends out a beam of light, looking for affected triangles
12:37PM EDT – For ray-tracing, the work only scales with the number of pixels that reach your eyes
12:36PM EDT – With rasterization, you have to project lighting from each light, meaning for big scenes, you have to model everything
12:35PM EDT – Ray tracing is much more parallelizable than rasterization
12:35PM EDT – Rasterization vs Ray Tracing
12:35PM EDT – We have two main rendering technologies
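
To make the rasterization/ray-tracing contrast of the entries just above concrete, the classic way to put it is as a loop inversion: object-order versus image-order. A pseudocode-level sketch in Python, where project, covered_pixels, first_hit, shade, and BACKGROUND are hypothetical helpers, not any real API:

def rasterize(triangles, lights, framebuffer):
    # Object-order: walk every triangle, find the pixels it covers.
    for tri in triangles:
        for px in covered_pixels(project(tri)):
            framebuffer[px] = shade(tri, px, lights)

def ray_trace(camera, scene, framebuffer):
    # Image-order: walk every pixel, find the first thing its ray hits.
    # Work scales with the pixels that reach the eye, not with scene lights.
    for px in framebuffer.pixels():
        hit = first_hit(camera.ray_through(px), scene)  # a BVH makes this cheap
        framebuffer[px] = shade(hit, px, scene.lights) if hit else BACKGROUND
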
12:34PM EDT – So we invented the NVIDIA RTX
12:32PM EDT – The 8800 GTX is the first CUDA GPU. And the GTX 1080 is the most powerful GPU today
12:32PM EDT – ‘Look at the GeForce 256, it’s so cute’
12:32PM EDT – Basically laughing at Moore’s Law. ‘1000x every ten years’ as opposed to the Moore’s 10X
12:31PM EDT – ‘Because of the demand and scale of video games, it has propelled one of the fastest-advancing computer science technologies’
12:30PM EDT – And it took 1.5h to render, at 60 pixels per second
12:29PM EDT – Ray tracing dates back to 1979
12:29PM EDT – (Looks like the livestream doesn’t have a direct feed of the projector, this time?)
12:29PM EDT – There’s all these hardware-intensive techniques to model how lighting bounces and refracts off things
12:25PM EDT – Ray tracing is the holy grail of computer graphics
12:25PM EDT – ‘The good news is, you’re gonna be surprised. … Everything on the web, every spec is wrong. You’re gonna be surprised’
12:25PM EDT – ‘I have never seen anything that leaked this much’
12:24PM EDT – “Welcome to the launch of the GeForce GTX 1180” he says
12:24PM EDT – ‘We
12:24PM EDT – Hey, Jensen! JHH is on the stage
12:24PM EDT – Ryan: Quick note, there are 161K people watching the live stream, according to Twitch stats
12:23PM EDT – And NVIDIA’s tech achievements alongside it
12:22PM EDT – A montage of influential games and developments
12:21PM EDT – And by that I mean ray tracing
12:21PM EDT – An intro video on computer graphics
12:21PM EDT – Lights out time
12:19PM EDT – Ryan: And continuing with that Xbox analogy, I expect we’ll need to apply the usual E3 rules. Dev presentations, some scripted “events”, and enthusiastic crowd managers sitting in the front rows to look good for the cameras and to try to rev up the crowd
12:19PM EDT – Starting with a remix of “Hold On, I’m Coming”
12:17PM EDT – The event is about to begin
12:12PM EDT – Not much going on at the moment, just the music pumping
12:11PM EDT – After an hour or so of milling around outside, we’re in! We’ve not been given wifi credentials, but thanks to our Tom’s Hardware colleague Igor we’re up and running
12:10PM EDT – Speaking off the cuff here, it really feels like NVIDIA is pulling from Microsoft’s Xbox playbook this year. I’m not sure if this is a good thing or not
12:08PM EDT – So if the flashy venue hasn’t already made it clear: at the end of the day, NVIDIA is in the business of selling gamers GPUs
12:06PM EDT – NVIDIA, being the masters of hype, have been heavily promoting this event for the past week as well
12:05PM EDT – For those of you that want to watch it live, NVIDIA is livestreaming this on Twitch: https://www.twitch.tv/nvidia
12:04PM EDT – Nate has just taken a seat (Nate, please give it back when you’re done)
12:04PM EDT – We’re now past 6pm local, and the event still hasn’t kicked off yet
12:04PM EDT – So I’m not going to underestimate NVIDIA here
12:03PM EDT – Of course, the last time NVIDIA was in this situation, we got Maxwell, which was an absolutely amazing GPU architecture
12:01PM EDT – So NVIDIA doesn’t get a full generation’s density improvements
12:01PM EDT – Meanwhile, the 12nm process NVIDIA has used for at least the largest Turing GPU is not a full node’s improvement over the 16nm process used on Pascal
11:59AM EDT – So it becomes a question of whether NVIDIA is willing to eat larger die sizes to include them in numbers large enough that they’re useful on this generation of hardware, or play it safe and include only small numbers of cores, essentially as a hardware compatibility mechanism for developers, with eyes on actual use in the next generation of products
11:58AM EDT – They don’t directly contribute to traditional rendering, however NVIDIA is trying to push an entire paradigm change here with ray tracing and neural networks
11:57AM EDT – The follow-up question to that being just how many Tensor and RT cores make it into the smaller GPUs
11:55AM EDT – Gamers will need something a bit smaller – and therefore more affordable – so I’m expecting to see at least one more Turing GPU announced today
11:55AM EDT – The one Turing chip we’ve seen so far, which is powering the first Quadro RTX cards, is absolutely massive: 754mm²
11:54AM EDT – And until the presentation kicks off, you can read our first Turing article here: https://www.anandtech.com/show/13214/nvidia-reveals-next-gen-turing-gpu-architecture
11:53AM EDT – Either way, it’s clear that we should be expecting a more gaming-focused take on the new Turing hardware, including some (but not a ton) of architectural information about what makes Turing faster for gaming, and information on Turing-powered GeForce cards
11:51AM EDT – So either this will turn out to be a masterful play at subterfuge, or everything you’ve heard over the last 72 hours is true
11:51AM EDT – Adding fuel to the fire, the moment NVIDIA informed board partners and retailers about the upcoming products, they all started leaking like sieves
11:50AM EDT – However last week’s Quadro RTX & Turing GPU announcement makes it pretty clear what we should be expecting here: a Turing-related GeForce announcement
11:49AM EDT – Formally, NVIDIA is promising “hands-on demos of the hottest upcoming games, stage presentations from the world’s biggest game developers, and some spectacular surprises” for this event
11:49AM EDT – For the event itself, NVIDIA CEO Jensen “The more you buy, the more you save” Huang will once again be taking the stage at this GeForce-branded event
11:48AM EDT – The press is still getting seated, so NVIDIA is probably not going to start right at 6pm local time
11:48AM EDT – Alright, we’re getting set for what should be NVIDIA’s biggest gaming-related announcement of the year

Images———————————————————————
*****************************************************************************

16 Replies to “NVIDIA GeForce Gamescom 2018 keynote live blog”

  1. Richard

    Well, gentlemen, the benchmarks of the new RTX cards are already out there, and they're not that big a deal; nothing crazy in performance either. But word is these RTX cards are good for screen recording, losing almost nothing; it seems that's the audience they're aimed at

    Reply
    • Maikel Post author

      We'll see when they go on sale and the real benchmarks and performance data come out, as well as who ships something in games that truly takes advantage of it with a noticeable result. In my opinion, almost nothing of what these cards deliver will be truly exploited until a couple of years from now; in the short term, only the odd thing here and there, as seen in the presentation.
      For now, though, the leaked benchmarks show the performance gain isn't much, and if you enable the ray-tracing options in the games that have released demos of the tech, the gain is zero or even a slight loss. Of course, you gain in visuals: in the opinions I've seen from attendees who tried BF V on the PCs available at the event, they say the visual difference is remarkable.

      Reply
      • Richard

        Well, on the claim that nothing they release gets exploited for a couple of years, we don't agree. Remember the presentation of the 10xx Pascal cards: you could see the difference from Maxwell a mile away. But between Pascal and Turing, at least from the little that's been shown, it hasn't improved that much; no reinvented graphics or anything like that. There are things that really do get exploited more as time passes, but gaming hardware doesn't work that way, because it's not true that if you buy a processor, or a graphics card in this case, you have to wait 2 years for it to deliver the performance it should. Bad move, Maik, and you'll see when more reviews come out.
        -20 reputation
        mission failed

        Reply
        • Maikel Post author

          Yes, they get exploited, but only very partially for now; after all, this whole area is just getting started. Things will stay mixed, rasterization + ray tracing. But broad adoption will still take a while. It's almost always like this: a technology in this field shows up very limited at first, and it's a couple of years later that it becomes more widespread or starts to mature.

          By the way, ray tracing in games is quite old, but it has never been used in production games; always in demos and experiments. Some basic, simplified pieces of it have been used for global illumination, displaced textures, and real-time reflections and shadows. We'll see more advanced demos, but in production, it really will take a while. Remember this is a new technology to exploit; everything done since the mid-90s has been the same thing, rasterization, just with more compute power and efficiency thrown at it.

          In hardware as such, I think the only important things to appear in the last 3-4 years have been memory compression (reducing the bandwidth needed, smaller bus, lower power) and packed data (reduced and/or adaptive precision compute, more performance, less power). Everything else is or has been software-based, in the drivers and in what developers do in the graphics engines. Using hardware better is more complicated than using software better. Look at DX12 and Vulkan: they've been out for about 3, almost 4 years, and still few use them or exploit them well. The only ones who really dig in and specialize are the AAA game developers, since there's more money for specialists there, and their ability to deliver more performance, or more complex and realistic scenes, is well demonstrated. Support has been making its way into game engines, but even so it's still underused. And it's not a hardware compatibility issue: they work on graphics cards from almost 8 years ago, even on integrated graphics.

          It's like the issue of game engines making good use of several CPU cores: only the odd one manages to use 4 of them reasonably well. It's since the Ryzens that more effort has gone into this. On consoles it was always done, but then, consoles are fixed hardware, there's not much fuss with them.

          Writing a software ray tracer is very easy, since it's a simple operation, with basic lighting of course. I wrote one myself more than 15 years ago as an experiment, with almost-real-time animation using some optimization tricks and so on; it depends on the hardware, of course. I have the executable around somewhere, I'll post it later as an example.
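
          To show how little it takes, here is a minimal sketch, basic lighting only, with constants and names invented just for this example: a complete software ray tracer for one diffuse sphere and one point light, writing a grayscale PGM image.

          import math

          W, H = 320, 240
          CAM = (0.0, 0.0, 0.0)                      # pinhole camera at the origin, looking down -z
          SPHERE_C, SPHERE_R = (0.0, 0.0, -3.0), 1.0
          LIGHT = (2.0, 2.0, 0.0)

          def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
          def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
          def norm(v):
              l = math.sqrt(dot(v, v))
              return (v[0]/l, v[1]/l, v[2]/l)

          def hit_sphere(o, d):
              # Solve |o + t*d - c|^2 = r^2 for the nearest positive t (d is unit length).
              oc = sub(o, SPHERE_C)
              b = 2.0 * dot(oc, d)
              c = dot(oc, oc) - SPHERE_R * SPHERE_R
              disc = b*b - 4*c
              if disc < 0:
                  return None
              t = (-b - math.sqrt(disc)) / 2.0
              return t if t > 0 else None

          rows = []
          for y in range(H):
              row = []
              for x in range(W):
                  d = norm(((x - W/2) / H, (H/2 - y) / H, -1.0))  # one primary ray per pixel
                  t = hit_sphere(CAM, d)
                  if t is None:
                      row.append(20)                               # background
                  else:
                      p = (CAM[0]+t*d[0], CAM[1]+t*d[1], CAM[2]+t*d[2])
                      n = norm(sub(p, SPHERE_C))                   # surface normal
                      l = norm(sub(LIGHT, p))                      # direction to the light
                      row.append(20 + int(235 * max(0.0, dot(n, l))))  # Lambert shading
              rows.append(row)

          with open("sphere.pgm", "w") as f:                       # plain-text grayscale image
              f.write(f"P2 {W} {H} 255\n")
              f.write("\n".join(" ".join(map(str, r)) for r in rows))

          Reflections, shadows, and refraction are just more rays fired from the hit point; that's the whole trick.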

          By the way, I'm going to post a little article with one of the video games I made back in the day, hehehe

          PS: it took me a bit to understand what you meant; write calmly, haha

          Reply
          • Richard

            What do you mean DX12 is barely used? What about the Xboxes, which as far as I know use exactly that. And Vulkan, nobody is ever going to use it. And if you mean DX11 is still ahead of DX12, it's because developers have spent more time on the former, because of W7, since W7 isn't compatible with DX12; that's the main reason. And the claim that DX11 performs better than DX12 is a myth, since they perform almost exactly the same.
            PPS: NVIDIA launched new hardware; the ray tracer already existed, which is why I'm talking about the hardware. And about CPUs with more than 4 cores: it's not that the 6-core ones perform worse, it's that games get the same out of them, since games weren't designed to count on all those cores. But the processors already perform well; it's not that you have to wait 2 years for them to get better. In that time, while the technology's foundations will be better developed, it won't be in this same product from today; it will be in the newly announced ones.

          • Maikel Post author

            -The Xbox One and the One X primarily use DX11.something (even some DX10 bits). There are games that do it with DX12, but they're almost the same ones that do on PC.
            -Yes, it's what they're used to, plus whatever the engines bring, but the major engines have been incorporating support to make it easier to use without specialization.
            -DX12 performs better than DX11 when it's worked properly and used as it should be, the engine is adapted to it, and the hardware is better prepared for it; otherwise little is gained.
            -Vulkan is used by a fair few; several engines have it built in and can use it. The best examples of Vulkan games are Doom and Wolfenstein II; others, DotA 2, etc. Go compare the performance against OpenGL (OpenGL is to Vulkan as DX11 is to DX12). See some here if you can: https://en.wikipedia.org/wiki/List_of_games_with_Vulkan_support. Some are even preparing to replace DirectX completely; others you'll see in the Linux versions. If you look at this one, you'll see the difference isn't that big: http://www.wikiwand.com/en/List_of_games_with_DirectX_12_support

          • Richard

            Yes, we agree on that. And since you bring up Wikipedia, a word about it: have you noticed that if you change the language you get different results, like for the planet's temperatures or water? The wiki is great, no doubt, but you have to read everything, and carefully, to avoid that kind of error.

          • Maikel Post author

            If you look at the Spanish one (awful, although improving a bit), it's more like that: loads of errors and outdated content. Personally the one I read is the English one, where it's harder to run into those problems. I gave you that one for convenience; for this topic, Khronos Group and Microsoft have pages tracking it. Of course, it's always vital to cross-check sources, references, and assessments. You learn a thing or two between undergrad theses, tutoring, consulting, and master's and PhD prep; a real "grind", haha.
            And I'll take the chance to give you a reference of my own. For about 7 years I've been developing GPU- and CPU-accelerated systems and so on; I program GPUs (AMD/Intel/NVIDIA/Qualcomm, etc.) at a low level and with OpenCL (CPUs too). That is, I've been wrestling with them in depth for quite a while, following architectures and software in detail, so I suppose some experience gets picked up. If you need anything on the subject, or some development work, let me know and I'll lend you a hand.

          • Maikel Post author

            Hey, look! This is where I learned to do ray tracing: in volume 2 of this book, also known as the "FoleyVanDam", this 1994 edition. And one of the images NVIDIA put in the presentation; look in the book's color-plate section.
            https://tecn.cubava.cu/files/2018/08/CCcover.jpg
            https://tecn.cubava.cu/files/2018/08/CCrt1.jpg
            https://tecn.cubava.cu/files/2018/08/CCimgs.jpg
            Some ray tracing demos I wrote back around 2004, learning from the book and experimenting. The most up-to-date ones (reflections, shadows, and refractions OK, while trying to keep it "fast") I can't find, but I have the code; what I don't have right now is Delphi 7 to compile them.
            https://tecn.cubava.cu/files/2018/08/demosRT.rar
            In the 90s there was a competition to see who could write a ray tracer with the fewest characters of code. I have it around here somewhere; when I find it I'll show you.
            Ah, no, of course it's on the internet, and better described. What isn't on there? haha
            Look at it here:
            https://www.teamten.com/lawrence/projects/shortest_ray_tracer/
            http://fabiensanglard.net/rayTracing_back_of_business_card/
            Holy crap, pure obfuscation! hahaha

            Hmm, it would be interesting to post an article about the IOCCC contest; there are some gems in there

  2. Oliver

    $999 for the 2080 Ti. That's over there; in Cuba it'll cost at least double, though I hope it doesn't go up that much, 1200 or 1500 at most, haha. But even so, it's beyond my wallet O_o

    Reply
  3. Nicanor FCB

    02:00PM EDT – And starting at $499 for the 2070
    02:00PM EDT – Starting at $699 for the 2080.

    Not bad, I think... as a start, I mean.
    Now the "middlemen" will do their thing.

    Reply
