Amdar wrote: I have several questions...
With both Vulkan and DX12 available, I think developers will opt for Vulkan, because the game will then be portable to more platforms such as Linux and Mac. Is anything known about the implementation of Vulkan on those systems, mainly Linux? This could be a big win for Linux and give it a real push. Is it known when there will be drivers supporting Vulkan for NVIDIA and AMD? In short, do you have any more information about Vulkan on the other systems?
Dfx wrote: Vulkan can give Linux a push, but bear in mind that developers are going to build on DX12 no matter what, because it basically represents more than 90% of the market and Microsoft will apply pressure wherever it is needed.
Most likely it will be the Steam Machines that truly revitalize the Linux platform and give it that long-awaited chance to compete head to head with Windows, but all of this is still in its infancy.
josemurcia wrote: Dfx wrote: Vulkan can give Linux a push, but bear in mind that developers are going to build on DX12 no matter what, because it basically represents more than 90% of the market and Microsoft will apply pressure wherever it is needed.
Most likely it will be the Steam Machines that truly revitalize the Linux platform and give it that long-awaited chance to compete head to head with Windows, but all of this is still in its infancy.
I'll grant you that Microsoft will open its wallet. But what do you mean by saying it represents more than 90% of the market?
If you mean Windows, Vulkan is compatible with Windows in the same way DX12 is. What's more, while DX12 is Windows 10 only, Vulkan will also run on earlier versions.
papatuelo wrote:
The card has two 8-pin connectors, so it's definitely drawing a lot of power.
papatuelo wrote:
Direct3D 12 feature checker (May 2015) by DmitryKo
https://forum.beyond3d.com/posts/1840641/
Using minimum feature level 11_0
ADAPTER 0
"AMD Radeon R9 200 Series (Engineering Sample - WDDM v2.0)"
VEN_1002, DEV_67B0, SUBSYS_30801462, REV_00
Dedicated video memory : 3221225472 bytes
Total video memory : 4294901760 bytes
Maximum feature level : D3D_FEATURE_LEVEL_12_0 (0xc000)
DoublePrecisionFloatShaderOps : 1
OutputMergerLogicOp : 1
MinPrecisionSupport : D3D12_SHADER_MIN_PRECISION_SUPPORT_NONE (0)
TiledResourcesTier : D3D12_TILED_RESOURCES_TIER_2 (2)
ResourceBindingTier : D3D12_RESOURCE_BINDING_TIER_3 (3)
PSSpecifiedStencilRefSupported : 1
TypedUAVLoadAdditionalFormats : 1
ROVsSupported : 0
ConservativeRasterizationTier : D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED (0)
MaxGPUVirtualAddressBitsPerResource : 38
StandardSwizzle64KBSupported : 0
CrossNodeSharingTier : D3D12_CROSS_NODE_SHARING_TIER_NOT_SUPPORTED (0)
CrossAdapterRowMajorTextureSupported : 0
VPAndRTArrayIndexFromAnyShaderFeedingRasterizerSupportedWithoutGSEmulation : 0
ResourceHeapTier : D3D12_RESOURCE_HEAP_TIER_2 (2)
Adapter Node 0: TileBasedRenderer: 0, UMA: 0, CacheCoherentUMA: 0
SashaX wrote: AMD should release the 390/X already, and please, make them run a bit cooler and less power-hungry.
If they meet those two conditions, two of them are coming my way.
Edit:
http://www.guru3d.com/news-story/amd-fiji-xt-photo.html
The card has two 8-pin connectors, so it's definitely drawing a lot of power.
D'OH!
Amdar wrote: I have several questions...
With both Vulkan and DX12 available, I think developers will opt for Vulkan, because the game will then be portable to more platforms such as Linux and Mac.
josemurcia wrote: Much to my regret, we have a new contender: instead of abandoning Metal on iOS and adopting Vulkan there and on OS X, Apple is going to do the opposite and adopt Metal on OS X without supporting Vulkan.
TRASTARO wrote: Apple is missing out [though I understand them].
In any case, this table seems to be the most complete and accurate one on the subject of support for Direct3D 12/DXGI features.
microsoft wrote: Direct3D feature levels
To handle the diversity of video cards in new and existing machines, Microsoft Direct3D 11 introduces the concept of feature levels. This topic discusses Direct3D feature levels.
Each video card implements a certain level of Microsoft DirectX (DX) functionality depending on the graphics processing units (GPUs) installed. In prior versions of Microsoft Direct3D, you could find out the version of Direct3D the video card implemented, and then program your application accordingly.
With Direct3D 11, a new paradigm is introduced called feature levels. A feature level is a well defined set of GPU functionality. For instance, the 9_1 feature level implements the functionality that was implemented in Microsoft Direct3D 9, which exposes the capabilities of shader models ps_2_x and vs_2_x, while the 11_0 feature level implements the functionality that was implemented in Direct3D 11.
Now when you create a device, you can attempt to create it at the feature level that you want to request. If device creation succeeds, that feature level is supported; if not, the hardware does not support that feature level. You can then either try to recreate the device at a lower feature level, or choose to exit the application. For more info about creating a device, see the D3D11CreateDevice function.
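The fallback pattern the quoted docs describe (request the highest feature level, step down on failure) can be sketched as a toy model. This is not the real D3D11CreateDevice call; the try_create_device stub is an invented stand-in for the driver's capability check, and the level list mirrors the D3D_FEATURE_LEVEL_* constants ordered highest first.

```python
# Toy model of the Direct3D feature-level fallback described above.
# Mirrors the D3D_FEATURE_LEVEL_* constants, highest first.
FEATURE_LEVELS = ["12_0", "11_1", "11_0", "10_1", "10_0", "9_3", "9_2", "9_1"]

def try_create_device(hw_max_level, requested_level):
    """Stand-in for D3D11CreateDevice: creation succeeds only if the
    hardware's maximum feature level is at least the requested one."""
    return FEATURE_LEVELS.index(hw_max_level) <= FEATURE_LEVELS.index(requested_level)

def create_best_device(hw_max_level):
    """Walk the list from highest to lowest and return the first level the
    (simulated) hardware accepts, or None if nothing matches."""
    for level in FEATURE_LEVELS:
        if try_create_device(hw_max_level, level):
            return level
    return None

# A GPU whose maximum level is 11_0 ends up with an 11_0 device even if
# the application first asks for 12_0.
print(create_best_device("11_0"))  # 11_0
```

This is why a card can "support DX12" (the API) while only exposing feature level 11_0 or 11_1, as in the feature-checker dump above.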
Edit: This post originally referred to Oland as a GCN 1.0 chip. It’s at least provisionally GCN 1.1, though it lacks certain features traditionally identified with that architecture, like TrueAudio support.
josemurcia wrote: That ExtremeTech article is based on erroneous information. To begin with, the table they claim comes from Microsoft: they don't say where they got it, and it is completely wrong.
Oland is GCN 1.0; I don't know where they get the contrary.
GRAPHICS FEATURES
DirectX/F_L: 12.0/11_1
OpenGL: xxx
OpenCL: xxxx
Or
GRAPHICS FEATURES
DirectX: 12.0
Feature_Level: 11_1
OpenGL: xxx
OpenCL: xxxx
TRASTARO wrote: We're going backwards; now they've removed this thread's sticky status.
Warhammer 40,000: Inquisitor Martyr developers on the benefits of the upcoming API and why they went multiplatform.
NeoCore Games, the developers behind the upcoming Warhammer 40,000: Inquisitor Martyr, have great experience in developing games for PC. In a recent interview with the development team at NeoCore, GamingBolt asked their thoughts on DirectX 12 and how it will speed up and enhance game visuals on PC and Xbox One.
“DirectX 12 is not something that allows you to play around with new shiny effects during development. It’s more of a big, structural development that gives programmers a lot more opportunity by providing them easier access directly to the GPU,” Producer Zoltán Pozsonyi explained to GamingBolt. “This means that they’ll have a lot more wiggle room for performance optimization and making the code run faster; if you’re able to squeeze higher performance out of the GPU, that can be translated into framerate or more beautiful content in the game. DX12 supports asynchronous compute shaders, which for example allows you to use more and better quality special effects and post-process stuff, a lot faster (screen space ambient occlusion, screen space reflection, better quality shadow mapping, translucency, tone mapping).”
Warhammer 40,000: Inquisitor Martyr will be heading to the PS4 and Xbox One alongside PC and Mac when it launches next year. What kind of challenges did the development team face while developing for consoles, and what prompted the decision to go beyond the PC for this release?
“Consoles have always been on our radar, as you might have heard The Incredible Adventures of Van Helsing trilogy is also coming to Xbox One and PlayStation 4,” Lead Narrative Designer, Viktor Juhász states. “The new console generation’s architecture is very similar to the PCs and the machines are also powerful enough, so our biggest challenge is to make the controls feel just as intuitive with controllers, as it is with a keyboard and mouse setup.
In both the PC and console versions we wanted to find the balance between being faithful to the lore and to the requirements of a great ARPG. We had to overcome some serious technological challenges when we decided to revamp the engine. We still have a long way to go, like the creation of the persistent open world.”
Warhammer 40,000: Inquisitor Martyr will be out in 2016 for the PS4, Xbox One and PC. Stay tuned for our full interview in the coming days.
paconan wrote: @trastaro I'm going to smack you...
How come you didn't give us a heads-up about this!!!!
http://www.guru3d.com/news-story/unreal ... -demo.html
http://elchapuzasinformatico.com/2015/0 ... -4-y-dx12/
Dfx wrote: I still don't see where they are going with ZEN, if it turns out to match the information released so far (50% more IPC, 2 threads per core, 16 cores?, HBM...).
The only thing that catches my attention is the talk of inverse HT; that really could be a genuine revolution, and it seems both Intel and AMD are working on it.
microsoft wrote: GPU virtual memory in WDDM 2.0
This section provides details about GPU virtual memory, including why the changes were made and how drivers will use it. This functionality is available starting with Windows 10.
Introduction
Under Windows Display Driver Model (WDDM) v1.x, the device driver interface (DDI) is built such that graphics processing unit (GPU) engines are expected to reference memory through segment physical addresses. As segments are shared across applications and overcommitted, resources get relocated through their lifetime and their assigned physical addresses change. This leads to the need to track memory references inside command buffers through allocation and patch location lists, and to patch those buffers with the correct physical memory reference before submission to a GPU engine. This tracking and patching is expensive, and it essentially imposes a scheduling model where the video memory manager has to inspect every packet before it can be submitted to an engine.
As more hardware vendors move toward a hardware-based scheduling model, where work is submitted to the GPU directly from user mode and where the GPU manages the various queues of work itself, it is necessary to eliminate the need for the video memory manager to inspect and patch every command buffer before submission to a GPU engine.
To achieve this, we are introducing support for GPU virtual addressing in WDDM v2. In this model, each process gets assigned a unique GPU virtual address space in which every GPU context executes. An allocation, created or opened by a process, gets assigned a unique GPU virtual address within that process's GPU virtual address space that remains constant and unique for the lifetime of the allocation. This allows the user mode driver to reference allocations through their GPU virtual address without having to worry about the underlying physical memory changing through its lifetime.
Individual engines of a GPU can operate in either physical or virtual mode. In the physical mode, the scheduling model remains the same as in WDDM v1.x, and the user mode driver continues to generate the allocation and patch location lists. These are submitted alongside a command buffer and are used to patch command buffers with actual physical addresses before submission to an engine.
In the virtual mode, an engine references memory through GPU virtual addresses. In this mode the user mode driver generates command buffers directly from user mode and uses new services to submit those commands to the kernel. In this mode the user mode driver doesn’t generate allocation or patch location lists, although it is still responsible for managing the residency of allocations. For more information on driver residency, see Driver residency in WDDM 2.0.
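The contrast the docs draw can be illustrated with a toy model, under a heavily simplified assumption of how a memory manager works: in the physical model a relocation invalidates the address a command buffer recorded (hence the patch location lists), while in the virtual model the GPU virtual address handed out at creation stays constant for the allocation's lifetime. All class and method names here are invented for illustration; this is not the WDDM DDI.

```python
class Allocation:
    """Toy allocation: the physical address can change whenever the memory
    manager relocates it, but the GPU virtual address assigned at creation
    stays constant for the allocation's lifetime (the WDDM v2 GpuMmu idea)."""
    _next_va = 0x1000  # toy bump allocator for virtual addresses

    def __init__(self, size):
        self.size = size
        self.physical = None                 # set/changed by the memory manager
        self.virtual = Allocation._next_va   # constant for the lifetime
        Allocation._next_va += size

class MemoryManager:
    """Toy video memory manager that places and relocates allocations."""
    def place(self, alloc, physical):
        # WDDM v1.x analogy: any command buffer that recorded the old
        # physical address would now have to be patched again.
        alloc.physical = physical

# The virtual address survives relocation; the physical one does not.
a = Allocation(0x100)
mm = MemoryManager()
mm.place(a, 0x40000)
va_before = a.virtual
mm.place(a, 0x80000)            # relocation: physical address changed
assert a.virtual == va_before   # GPU VA unchanged, so no patching is needed
print(hex(a.virtual), hex(a.physical))
```

The point of the sketch: a user mode driver that records only virtual addresses can build command buffers that stay valid across relocations, which is what lets WDDM v2 drop the inspect-and-patch step on every submission.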
GPU memory models
WDDM v2 supports two distinct models for GPU virtual addressing, GpuMmu and IoMmu. A driver must opt-in to support either or both of the models. A single GPU node can support both modes simultaneously.
GpuMmu model
In the GpuMmu model, the video memory manager manages the GPU memory management unit and underlying page tables, and exposes services to the user mode driver that allow it to manage GPU virtual address mapping to allocations.
IoMmu model
In the IoMmu model, the CPU and GPU share a common address space and page tables.
For more information, see IoMmu model.
Michael Larabel, phoronix.com wrote: NVIDIA is readying their Vulkan drivers for a same-day release, and on the Windows side they've already begun exposing some of the Vulkan interface.
NVIDIA's 358.66 driver for Windows adds a few OpenGL runtime changes referencing VK (Vulkan) and there's a new nv-vk32 library that exposes a number of Vulkan functions. Beyond that, this driver sets new runtime capabilities in OpenCL for the forthcoming Pascal and Volta GPUs.
TRASTARO wrote: Oof, and who knows when AMD will release drivers that support Vulkan for Windows/Linux.