AMD’s Radeon RX Vega 64 is finally out after the long wait, promising stiff competition for what’s become the new standard for high-end VR, NVIDIA’s 16nm Pascal-based GTX 1080.

Created as a replacement for the Radeon R9 Fury X and R9 Fury, the RX Vega 64 is built on a 14nm FinFET process, incorporating 64 compute units and 4096 stream processors with a 1247 MHz base clock and a 1546 MHz boost clock. Critically, AMD’s new card doubles Fury’s RAM from 4GB to 8GB of the company’s second-generation High Bandwidth Memory (HBM2).
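Those shader and clock figures translate directly into the card’s headline compute number. Here’s the back-of-the-envelope arithmetic as a quick Python sketch (the ×2 factor assumes one fused multiply-add per shader per clock, the standard way peak FP32 throughput is quoted):

```python
# Peak FP32 throughput from the spec-sheet numbers above.
stream_processors = 4096
boost_clock_ghz = 1.546
ops_per_clock = 2  # a fused multiply-add (FMA) counts as two FP32 ops

peak_tflops = stream_processors * ops_per_clock * boost_clock_ghz / 1000
print(f"Peak FP32 throughput: {peak_tflops:.2f} TFLOPS")  # ~12.66
```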

We stacked up the GTX 1080 against the GTX 980 Ti in a head-to-head VR benchmark, something worth looking into if you’re not familiar with NVIDIA’s consumer-grade graphics card. Needless to say, the GTX 1080 has become a high-end go-to for VR systems for a reason: it can chew through nearly anything current games can offer on high settings at a reliable 90 fps.

While we’ve thoroughly tested the GTX 1080, we haven’t had a chance to test the RX Vega 64 for ourselves, but our friends over at Eurogamer maintain that AMD’s newest GPU is competitive with Nvidia’s, though critically it “offers no knockout blow – in the here and now, at least.”

image courtesy AMD

Similarly, a review by TechAdvisor contends the RX Vega 64 performs well out of the box for VR and 4K gaming, coming within just a handful of frames per second of the GTX 1080 depending on the game and quality settings, but it also has greater power and cooling requirements than NVIDIA’s card.

Coming in both liquid-cooled and air-cooled varieties, the Vega 64 gulps down the watts: typical board power is 345W for the liquid-cooled card and 295W for the air-cooled card, nearly double the GTX 1080’s 180W maximum.
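For rough perspective, here’s that power gap in numbers, plus a common rule of thumb for PSU sizing (the 2.5× multiplier is just a generic headroom heuristic for the rest of the system, not an official AMD or NVIDIA recommendation):

```python
# Typical board power (watts), as quoted above.
boards = {
    "RX Vega 64 (liquid)": 345,
    "RX Vega 64 (air)": 295,
    "GTX 1080": 180,
}

for name, watts in boards.items():
    ratio = watts / boards["GTX 1080"]
    # Rule-of-thumb PSU sizing: leave headroom for CPU, drives,
    # fans, and PSU efficiency losses.
    suggested_psu = round(watts * 2.5, -1)
    print(f"{name}: {watts} W ({ratio:.2f}x GTX 1080), ~{suggested_psu:.0f} W PSU")
```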

Initially announced at a starting MSRP of $500 for the air-cooled card, the RX Vega 64 is supposed to undercut the GTX 1080, which launched in May 2016 at a $599 MSRP. Realistically though, online retailers are currently selling the cards at a heavy markup, between $650 and $700. You can probably thank cryptocurrency miners for that.

Despite the high introductory price, it remains to be seen whether the GPU can eventually justify its price tag once developers get a chance to optimize their programs around the card’s new features. Only time will tell, as the GTX 1080 has had more than a year for developers to work out the best ways to squeeze every last drop of performance out of the card.

What about VR?

VRMark, the VR benchmarking software from Futuremark, offers a few options for testing GPU performance, including the Orange Room and the Blue Room benchmarking tests.

According to Futuremark, the Orange Room benchmark certifies whether your system meets the recommended requirements for the HTC Vive and Oculus Rift. The Blue Room, however, is a more demanding test, designed to benchmark the latest graphics cards by pushing a 5K rendering resolution and volumetric lighting effects.
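To see why the Blue Room is so much heavier, compare raw pixel throughput at the 90 fps target both headsets demand. A quick sketch (2160×1200 is the combined native resolution of the Rift and Vive; the 5120×2880 ‘5K’ figure is an assumption for illustration, as Futuremark doesn’t publish the exact render target here):

```python
# Pixel throughput needed to sustain 90 fps at each render target.
FPS = 90  # both the Rift and Vive refresh at 90 Hz
targets = {
    "Rift/Vive native (2160x1200)": (2160, 1200),
    "Assumed 5K target (5120x2880)": (5120, 2880),
}

print(f"Per-frame budget at {FPS} fps: {1000 / FPS:.1f} ms")
base = 2160 * 1200 * FPS
for name, (w, h) in targets.items():
    throughput = w * h * FPS  # pixels per second
    print(f"{name}: {throughput / 1e9:.2f} Gpix/s ({throughput / base:.1f}x)")
```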

Here you can see the RX Vega 64 compared to both the GTX 1080 and GTX 1080 Ti:

The RX Vega 64 also supports the Vulkan API and DirectX 12 (feature level 12_1), and is a ‘Radeon VR Ready Premium’ GPU, a class of hardware that AMD contends “meet[s] or exceed[s] the Oculus Rift or HTC Vive recommended specifications for graphics cards.”
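In practice the ‘VR Ready Premium’ badge boils down to a membership test against AMD’s published list. A trivial illustration of that kind of check (the set below is partial and purely for demonstration; AMD’s official list is linked in the next paragraph):

```python
# Partial, illustrative list only; consult AMD's official
# 'Radeon VR Ready Premium' page for the authoritative set.
VR_READY_PREMIUM = {"RX Vega 64", "RX Vega 56", "RX 580", "RX 480", "RX 470"}

def meets_vr_recommended(gpu: str) -> bool:
    """True if the GPU meets/exceeds the Rift or Vive recommended spec."""
    return gpu in VR_READY_PREMIUM

for card in ("RX Vega 64", "R7 250"):
    verdict = "VR Ready Premium" if meets_vr_recommended(card) else "not certified"
    print(f"{card}: {verdict}")
```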

At this point, you probably shouldn’t shell out big bucks for a card that hasn’t yet proven itself against the GTX 1080. If you’re looking to boost your rendering power on a budget, we’d wait until the cards sell at (or below) the initially advertised MSRP before making any hasty purchases. Admittedly, AMD offers a number of cards in its ‘Radeon VR Ready Premium’ class that are much easier on the wallet, including the $450 RX 480 (4GB). If you’re determined to build an AMD-only system, check out the list of the company’s GPUs here.

AMD Radeon RX Vega 64 Specs:

Interface

Interface: PCI Express 3.0

Chipset

GPU: Radeon RX Vega 64
Core Clock: 1247 MHz
Boost Clock: 1546 MHz
Stream Processors: 4096

Memory

Memory Speed: 945 MHz
Memory Data Rate: 1.9 Gbps
Memory Size: 8GB
Memory Interface: 2048-Bit
Memory Type: HBM2

3D API

DirectX: DirectX 12
OpenGL: OpenGL 4.5

Ports

HDMI: 1 x HDMI 2.0b
Multi-Monitor Support: 4
DisplayPort: 3 x DisplayPort 1.4
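The memory numbers in the table above are self-consistent, and the card’s peak bandwidth falls straight out of them. A quick sketch of the arithmetic (HBM2 transfers twice per clock, which is where the quoted 1.9 Gbps per-pin rate comes from):

```python
# Derive peak memory bandwidth from the spec-sheet figures above.
memory_clock_mhz = 945
transfers_per_clock = 2   # HBM2 is double data rate
bus_width_bits = 2048

data_rate_gbps = memory_clock_mhz * transfers_per_clock / 1000  # per pin
bandwidth_gbs = data_rate_gbps * bus_width_bits / 8             # whole bus

print(f"Per-pin data rate: {data_rate_gbps:.2f} Gbps")  # 1.89, quoted as 1.9
print(f"Peak bandwidth: {bandwidth_gbs:.1f} GB/s")      # ~483.8
```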

  • Alorwin

    Fucking crypto-miners… All they’re doing is justifying Nvidia’s shitty pricing policies.

    • Xron

      China is clamping down on smaller bitcoin trading channels, so miners might take a heavy hit from it; China has ~45% of coin mining under its control.

    • Tom_Craver

      Seriously doubt it’s crypto-miners. They’ve got dedicated hardware these days that is thousands of times faster and cheaper per crypto-calculation (hash).
      Pricing is more likely due to gamers who eagerly buy the latest/fastest GPU for bragging rights, no matter what.
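      For scale, some rough, hedged numbers: a high-end GPU manages on the order of 1 GH/s on bitcoin’s SHA-256, while a 2017-era ASIC like the Antminer S9 is rated around 14 TH/s (both figures approximate, just to show the magnitude):

      ```python
      # Rough, order-of-magnitude comparison only.
      gpu_hashrate = 1e9     # ~1 GH/s, ballpark SHA-256 rate for a high-end GPU
      asic_hashrate = 14e12  # ~14 TH/s, 2017-era bitcoin ASIC (e.g. Antminer S9)

      print(f"ASIC advantage: ~{asic_hashrate / gpu_hashrate:,.0f}x per device")
      # -> ~14,000x, before even counting power efficiency
      ```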

      • Alorwin

        You’re thinking of bitcoin; Ethereum and similar cryptocoins still need GPUs.

        So yeah: Fucking crypto-miners.

  • AndyP

    Smaller manufacturing process, but more energy/heat for similar performance – oh dear!

    • Petar Posavec

      That’s because of several factors:
      1. AMD is using a GloFo manufacturing process suited to low clock speeds.
      2. Nvidia is on a manufacturing process suited to high clocks. That’s why they were able to clock Maxwell up to current levels with the efficiency at its disposal.
      3. AMD frequently overvolts their GPUs. The standard voltage is 1.2V at maximum.
      Nvidia optimized its voltages out the door thanks to auto-voltage tuning (AMD doesn’t have that).
      However, when a Vega 56, for example, was undervolted, overclocked on the core to 1613 MHz, and had its HBM overclocked to about 950 MHz, it reached and surpassed the GTX 1080 while drawing less power than the 1080.
      Even an overclocked 1070 can’t match that, and Vega 64 sits closer to the 1080 Ti.
      https://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/44084-amd-radeon-rx-vega-56-und-vega-64-im-undervolting-test.html

      Also worth noting: the Vega architecture has had zero developer support.
      So it’s mostly brute-forcing its way through most games as-is; however, it also shoots past the 1080 Ti and Titan in various professional software (indicating the hardware is more than capable).
      Plus, there’s the fact that Vega has a lot more processing units than Pascal does, which by default raises power consumption; if you check the GTX 1080, it has about 500 fewer GPU cores than Vega 64.

      4. AMD achieves the same or better performance than Nvidia at lower clock speeds. If they were able to use the same manufacturing process as Nvidia and clocked similarly high, it would smash the Pascal lineup from brute force alone.

      Overall, the conclusion is that AMD does have an efficient architecture, but they’re limited by lower budgets, which keep them from optimizing voltages out the door and force a manufacturing process suited to low clock speeds.
      When voltages are set to similarly low levels as Nvidia’s via Wattman, Vega is quite efficient.

      Right now, most of the efficiency and performance tuning is left to the users (hence the availability of Wattman), and games haven’t been optimized for Vega (plus the Infinity Fabric in the GPU is not optimized for games either, according to Raja Koduri, so we can most likely expect drivers to fix this).
      Optimizations can make or break a product.
      Ryzen easily saw up to a 30% increase in performance from developers releasing some quick patches to support the architecture in question (but to take full advantage of it, devs will likely need to code for Ryzen during game development as they do for Intel).

      You need to keep these things in mind to see the distinction between AMD and Nvidia.
      AMD is not by any stretch a bad company with bad products.
      Right now it’s the industry that needs to catch up, but AMD also has to implement better voltage control/optimization out the door for their GPUs, because the current settings portray them in a bad light unnecessarily.

      It’s quite nice to see Vega performing as it does right now without developer optimizations.
      It will be interesting to see what those enhancements bring.
      I’m thinking the titles that might release patches to support Vega in one way or another will be AMD-sponsored titles such as Hitman, Deus Ex, AotS, etc. (but this is strictly a hypothetical maybe… not a definitive).

      • Maurice Fortin

        And Nvidia trimmed back much of the extras, so the differences between TSMC and GF are not readily apparent; it’s hard to do a side-by-side to see them. Between Maxwell and Pascal, Nvidia trimmed back quite a bit to ramp up clock speeds, among other things, so those aren’t directly comparable either. The only way the difference could be seen would be to build Maxwell and Pascal on the exact same process, or the older R9 Fury and the newer Vega both on TSMC 16nm.

        Just IMO, maybe GF is not as refined for high clocks, but there is also a very distinct difference in the way Nvidia “optimizes” by trimming back their design to ramp up clocks (less under the hood for more money out of the end user’s pocket, and less “sturdy” as well, since it doesn’t have nearly the same threshold for temperature stability; they like to rely on fancy circuits to adjust amperage/temperature/clocks and such). Clock for clock, Polaris/Vega are really the superior product, BUT they are held back by GF vs TSMC (not much AMD could do about this either, as shifting the lion’s share of production over would cost many millions upon millions due to the wafer agreement).

        The guys below me, and other marketplaces, need to use their damn brains lol. At least one guy had the crypto-mining angle right: saying modern graphics cards are used for bitcoin mining is asinine; they really haven’t been for the past two years minimum, because ASICs are dedicated to that task. ALSO, the more a product gets bought, technically the better in the end; it sucks for now, but it gives the company that makes them more money to further R&D for the next product.

        As for Nvidia’s shitty pricing policies, that has NOTHING to do with crypto mining at all; they have just always been shitty on pricing (especially considering the lack of “quality” in their designs, i.e. VREG/VRM built for the bare minimum instead of the worst expected case; it does not take much for VREG/VRM to hit the 85-105°C range when they are usually in very poor airflow locations).

        Everybody buys food as well, but if no one was buying and YOU wanted a specific something to munch on, it would cost much more, i.e. supply and demand. It hurts a bit till things stabilize, but they will; the seller greed, however, never will. If AMD/Intel/Nvidia say the product should be on the shelf for X in North America, that is what it should be IMO, but greed from sellers saying “we had to push up the price because they are selling too fast” is moronic.

        • AndyP

          The thing I found REALLY annoying: I had a 780 Ti when the CV1 was released – it worked like a dream. BUT they released drivers that crippled the old 780 Ti’s performance (in the same games!) and rendered my investment in high-end GPUs worthless. I won’t forgive the cynical money-grubbing b*****rds at Nvidia for that any time soon (nor the shameless waste of resources in these days of ‘austerity’ and environmental degradation).

          • Petar Posavec

            If you’re unsatisfied with Nvidia and are aiming to get a higher-end GPU (unless you already have one), then go with the Vega 56.
            Undervolt both the core and the VRAM (the VRAM needs to be dropped to about 950mV in order to allow the core voltages to be applied). The final two core P-states shouldn’t be clocked the same, just put in a minor difference, but they can both have the same undervolt: say, put them at 1100mV for starters, test to check for stability, and if it’s stable, then try undervolting the last two P-states to 1050mV.
            Oh, and the HBM/VRAM should be overclocked to 950 MHz on Vega 56 (you could try higher, but I don’t know if it will allow it; for some people even 950 MHz was unstable, so they started at about 900 and went progressively higher).
            Oh, and the power limit should be set to about 25% or 50%.
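            In other words, it’s a stability-search loop. Something like this sketch, if it helps to see it written out (Wattman is a GUI, so apply_settings and stress_test_passes below are purely hypothetical stand-ins for the manual steps, not a real API):

            ```python
            def apply_settings(p6_mv, p7_mv, vram_mv):
                # Stand-in: in reality you type these into Wattman by hand.
                print(f"Apply: P6/P7 = {p6_mv}/{p7_mv} mV, VRAM = {vram_mv} mV")

            def stress_test_passes():
                # Stand-in: run a stress test or game and watch for crashes.
                return input("Stable? [y/n] ").strip().lower() == "y"

            def find_stable_undervolt(start_mv=1100, floor_mv=1050, step_mv=25):
                """Walk the last two P-state voltages down until unstable."""
                vram_mv = 950  # drop VRAM voltage first so core voltages apply
                stable_mv = None
                mv = start_mv
                while mv >= floor_mv:
                    apply_settings(p6_mv=mv, p7_mv=mv, vram_mv=vram_mv)
                    if not stress_test_passes():
                        break  # keep the last voltage that passed
                    stable_mv = mv
                    mv -= step_mv
                return stable_mv
            ```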

            The VRAM voltage in Wattman seems to behave like another core voltage setting for Vega; this is because the HBM and the cores share the same package. It’s also why some undervolts from other users were not successful (they only altered the last two P-states without changing the VRAM voltage).

            That’s just if you decide to go that route.
            :-)

            Your call in the end.

          • AndyP

            Thanks. It’s not the performance I was highlighting; I have an overclocked 1080 that’s more than adequate. It’s the fact that they either deliberately undermined the 780 Ti or just failed to support a card that was perfectly fine. I guess this is how it is today, but I think it sucks when you invest in a high-end card, and it didn’t use to be this way (or at least not so soon).

          • Petar Posavec

            I’m not sure the performance was actually crippled on older GPUs.
            I think I saw a review of this issue, and performance seemed to have remained the same.
            However, I don’t own a 780 Ti, so I can’t speak from personal experience.
            If performance did in fact drop on the newest drivers, then simply stick with the older ones.
            Nvidia simply decides to stop supporting their older hardware when they deem it necessary, and that might show up as poor driver support that’s more focused on newer GPUs.

            Aside from that, I do think AMD has vastly improved versus Nvidia in the driver department. They’re shipping more updates, faster, and are better prepared for new game releases, without necessarily breaking support for older GPUs.
            However, I suspect that even AMD’s newest drivers wouldn’t be of much use to older GPUs.
            But if a person has a 7970 GHz Edition, they could flash the BIOS to have it read as a 280X.
            Most of those are rebrands, but I would suspect that after flashing you’d essentially ensure driver support for a while longer.

          • AndyP

            Good advice, but I did roll back the drivers and flash the BIOS; it worked for a while, but it didn’t last (they updated the games and I was back to square one). I’ve got a long view of how they support cards, going back 20 years, and it’s only recently that support has been so time-limited for high-end cards (and perhaps others). The 780 Ti was such a good GPU, I suspect too good, and they may have chosen to knock it off its perch. The drivers had a big impact on performance, I’m sure, as I saw it fall and then recover when I rolled back (at first), and that’s a clear temporal, intervention-related cause-and-effect relationship. Technological advancement used to be good and profitable enough that it didn’t need underhanded or indifferent tactics.

          • AndyP

            P.S. The later reviews showed better results/comparisons for the newer cards because they used later drivers! Only the first reviews showed valid comparisons in this context.

      • AndyP

        Thanks for all the detail, but the point still stands. I’ve been buying graphics cards since the 3dfx Voodoo (I’m no fanboy and want competition), but unfortunately I’m not overly impressed by this. I do, however, understand that optimisation will, I hope, help, though I wonder what they were doing during the delay. I also want the underdog to compete with the Nvidia monopoly, as that’s for the benefit of all. I just wish it was the jump that, I’m afraid, Nvidia have made when releasing new architectures (though noting lots of caveats). I don’t want a card that’s great only for a limited set of titles. Let’s hope for significant optimisations and price cuts.

        • Petar Posavec

          To add to that, we need to keep in mind that AMD is a company with much smaller resources, and the industry has almost always provided developer/software optimizations for larger companies willing to pay for them.
          AMD is also a corporation, but of the options on offer, they are the underdog and need our support to make better products.
          Having said that, Vega has (at least as far as I can remember) been specifically targeted to compete against the 1070 and 1080 in gaming; however, it also came with a lot of new features (as already mentioned) which need to be coded for.

          If I had a choice between AMD and Nvidia, I’d go with AMD, because I know what to do to squeeze enough efficiency and performance out of their products to be on par with or better than the competition.
          In fact, I’m getting myself an Asus ROG laptop in about a month with a Ryzen 1700 and an RX 580.
          A sweet 17″ laptop, which is just what I’ve been eyeing for a while now.
          Once I undervolt the thing on the CPU and GPU, it will run even better/cooler.
          :-)

          • AndyP

            Agreed. But I still wish the much-hyped card was better out of the gate. At least they’re putting some pressure on Nvidia. It may be impractical for you, but why not a desktop? More room, more cooling, more power, greater overclocking headroom (and therefore longevity), more inputs/connections, better device and new-tech compatibility; it’s upgradable, and you can replace components easily if they fail. And you can’t leave it on a train (a bad memory, in two ways!).

          • Petar Posavec

            Can’t argue with you there.
            AMD needs to improve efficiency out the door and stop overvolting their GPUs.
            They have good products, and they can have good efficiency too if they fix this issue with voltages.
            People end up with negative first impressions of AMD because of early reviews, and even if we know we can manually make their products better in performance/efficiency, most people don’t care and will sooner see that as ‘apologizing for AMD’ and ‘justifying their failures’.

            Btw, one couldn’t expect AMD to have bested the 1080 Ti out the door in gaming. The Vega FE did, however, beat it and the Titan/Quadros in pro software (science, etc.) for a fraction of the price.
            But again, reviewers don’t particularly focus on the highlights; most test games and don’t consider overvolting as the culprit behind the large power draw (along with clocks that are too high and aren’t suited to the manufacturing process).

            Oh, and correct me if I’m wrong, but the 1080 Ti was released as a response to Vega.
            Nvidia is a much larger company that can afford to do that; AMD can’t.
            I suspect the performance gap between the Vega lineup and the 1080 Ti will narrow once devs start optimizing for Vega’s features (that is… IF they do).

          • AndyP

            …and much more ‘bang for buck’, in both the short and long term.

          • AndyP

            …potentially massive, higher quality and many screens, more peripherals… I’ll shut up now!

  • Andrew Jakobs

    “Only time will tell, as the GTX 1080 has had more than a year for developers to work out the best ways to squeeze every last drop of performance out of the card.”

    That’s the problem these days: GPUs aren’t pushed to their limits like in the early days, when every secret was uncovered because cards stayed in use longer and you had to actually program at the hardware level. These days it isn’t even really interesting to try and optimize for specific GPUs, given the sheer number of GPUs out there; that’s the job of the GPU manufacturers, and they seem to abandon older GPUs really fast (or fubar them with an update and never fix what they fubarred).
    That’s where consoles still have a leg up, if developers actually want to use the full power: on console it pays off to actually know the hardware, since you know everyone will have the same configuration (save for some differences in SKUs). That’s why a console with a lesser GPU can perform much better than a regular Windows system with the same type of GPU (less overhead and more optimized code).

    Seeing this new card, I don’t know if you’re better off buying this one or a GTX 1080, especially with the power it draws and the heat it puts out. But time will tell.

  • NooYawker

    It’s good to see AMD throwing their hat back in the ring.

  • But it is very hard to find the RX Vega 64, and it is also overpriced.