Virtu MVP 2.0 + HyperFormance / integrated GPU program

rakz Join Date: 2012-10-29 Member: 164315 Members
Does this work? I know it gives some performance increase in BF3.
People with Z77 boards get it for free. I'll be switching my i5 2500k to a 3570k I already bought (crazy deal, though).
My HD 3000 will become an HD 4000, and with that I think Virtu MVP will make more of a difference in fps.


You're welcome to discuss it.

Comments

  • DC_Darkling Join Date: 2003-07-10 Member: 18068 Members, Constellation, Squad Five Blue, Squad Five Silver
    I am unaware of this thing you are talking about.
  • Curve Join Date: 2003-12-17 Member: 24475 Members, Reinforced - Shadow
    DC_Darkling wrote: »
    I am unaware of this thing you are talking about.

  • ScardyBob Join Date: 2009-11-25 Member: 69528 Forum Admins, Forum Moderators, NS2 Playtester, Squad Five Blue, Reinforced - Shadow, WC 2013 - Shadow
    Sounds like he's referring to this: http://www.lucidlogix.com/technology-hyperformance.html

    Also, info here: http://www.tweaktown.com/articles/4651/lucid_virtu_mvp_hyperformance_tested_with_asrock_z77_and_intel_ivy_bridge/index3.html

    Mostly sounds like a way to use both your Integrated and Dedicated graphics while gaming. I know AMD is going this route a bit with the Dual Graphics options with its APUs (e.g. you can run a dedicated GPU in crossfire with the APU's integrated graphics).

    For NS2, unless you're GPU limited, I don't really see it increasing your fps.
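    If you want to sanity-check which side you're on, the logic is basically this rough Python sketch. The function name, timing fields, and 1 ms threshold are all made up for illustration; plug in whatever r_stats or your overlay actually reports.

        def bottleneck(cpu_ms, gpu_wait_ms, threshold_ms=1.0):
            """If the engine spends noticeable time waiting on the GPU each
            frame, you're GPU-bound; otherwise the CPU is the limit."""
            if gpu_wait_ms > threshold_ms:
                return "GPU-bound (waiting %.1f ms for the GPU)" % gpu_wait_ms
            return "CPU-bound (%.1f ms of CPU work per frame)" % cpu_ms

        print(bottleneck(cpu_ms=14.0, gpu_wait_ms=0.2))  # -> CPU-bound
        print(bottleneck(cpu_ms=9.0, gpu_wait_ms=6.5))   # -> GPU-bound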
  • DC_Darkling Join Date: 2003-07-10 Member: 18068 Members, Constellation, Squad Five Blue, Squad Five Silver
    But talking to some people, the best way to explain it is that Virtu MVP has the ability to make use of unused CPU cycles to help increase overall video performance.
    Almost no one in the NS2 community has unused CPU cycles, as NS2 is HUGE on CPU usage. If someone does, their graphics card must be pretty much garbage compared to their CPU. hehe

    I also remember reading about Intel planning something similar a good while back, but don't quote me on that. My memory tends to be crap.

    Another problem with using 2 GPU chips is present in stuff like SLI, video cards with 2 chips, and most likely also this Virtu thing.
    It's that the 2 chips need to communicate with each other. This produces some latency and has been known to cause microstutters. Both AMD and Nvidia surely do their best to fix it, but I would expect more microstutters.

    So no, I do not think it will work in NS2 specifically, and I would keep testing in other games to see whether it makes a difference for each one.
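    To make the microstutter point concrete, here is a quick Python sketch with toy numbers (not measurements): the average fps of two runs can look nearly identical while the frame-to-frame pacing is completely different, and it's the pacing you feel.

        from statistics import mean, pstdev

        def pacing_report(frame_times_ms):
            # Jumps between consecutive frame times are what show up as stutter.
            deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
            return {"avg_fps": round(1000.0 / mean(frame_times_ms), 1),
                    "frametime_jitter_ms": round(pstdev(frame_times_ms), 1),
                    "worst_frame_to_frame_jump_ms": max(deltas)}

        smooth = [16.7] * 8                         # steady ~60 fps
        stutter = [10, 23, 11, 22, 10, 24, 11, 23]  # same-ish average, uneven pacing
        print(pacing_report(smooth))
        print(pacing_report(stutter))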
  • buhehe Join Date: 2012-05-15 Member: 152140 Members
    edited March 2013
    DC_Darkling wrote: »
    Almost no one in the NS2 community has unused CPU cycles, as NS2 is HUGE on CPU usage. If someone does, their graphics card must be pretty much garbage compared to their CPU. hehe

    Well, my GPU (6950 2GB) quite often limits my CPU (2500k @ 4.3GHz) when I'm doing 50-80 fps.
    But when I'm down to 40 fps, I'm CPU bound.


    @rakz
    2500k to 3570k is a sidegrade, what's the point?

  • revo_phx Join Date: 2010-10-27 Member: 74626 Members
    buhehe wrote: »
    @rakz
    2500k to 3570k is a sidegrade, what's the point?

    Don't know either... strange upgrade, tbh.
  • ScardyBob Join Date: 2009-11-25 Member: 69528 Forum Admins, Forum Moderators, NS2 Playtester, Squad Five Blue, Reinforced - Shadow, WC 2013 - Shadow
    buhehe wrote: »
    Well, my GPU (6950 2GB) quite often limits my CPU (2500k @ 4.3GHz) when I'm doing 50-80 fps.
    But when I'm down to 40 fps, I'm CPU bound.
    I get the same thing on my 2600k @ 4.7GHz + HD 6950, but only when I crank up my graphics settings (e.g. with everything on full and in combat, I'll get 'waiting for GPU' in the 10+ms range). However, our hardware setup is not that common (like putting a Ferrari engine in a Kia), so most people are almost always CPU bound.
  • DC_Darkling Join Date: 2003-07-10 Member: 18068 Members, Constellation, Squad Five Blue, Squad Five Silver
    My hardware config is far from common either, but let's not forget a detail.

    When I go on max graphics, I am GPU limited.
    When I go a bit below it, I am CPU limited.
    This only shows that my limits are fairly close together, but it's not a sign that the limit is gone. NS2's CPU limit is still a lot rougher than in many games.
  • Sandrock Join Date: 2002-12-16 Member: 10905 Members, Constellation, Reinforced - Shadow
    ScardyBob wrote: »
    buhehe wrote: »
    Well, my GPU (6950 2GB) quite often limits my CPU (2500k @ 4.3GHz) when I'm doing 50-80 fps.
    But when I'm down to 40 fps, I'm CPU bound.
    I get the same thing on my 2600k @ 4.7GHz + HD 6950, but only when I crank up my graphics settings (e.g. with everything on full and in combat, I'll get 'waiting for GPU' in the 10+ms range). However, our hardware setup is not that common (like putting a Ferrari engine in a Kia), so most people are almost always CPU bound.
    I'm basically the same: 2600k @ 4.4GHz + 6970. I've tried using the Virtu feature of my mobo in the past, didn't notice much of a performance increase, and it didn't work well with lots of games. L4D2 would default to using only the integrated graphics when I had Virtu enabled.

  • BVKnight Join Date: 2012-02-26 Member: 147496 Members
    Full disclosure: I'm speaking only from the bit of research I did when I built my system. Anandtech has a great explanation of how Virtu works, but it's very technical; check there to confirm anything I say.
    DC_Darkling wrote: »
    But talking to some people, the best way to explain it is that Virtu MVP has the ability to make use of unused CPU cycles to help increase overall video performance.
    Another problem with using 2 GPU chips is present in stuff like SLI, video cards with 2 chips, and most likely also this Virtu thing.
    It's that the 2 chips need to communicate with each other. This produces some latency and has been known to cause microstutters. Both AMD and Nvidia surely do their best to fix it, but I would expect more microstutters.

    Virtu works by using the IGP (integrated graphics processor) that is part of a lot of modern CPUs. Intel's HD xxxx series and AMD's Llano and Trinity APUs are all processors with an IGP. So yes, in a sense you are making use of otherwise idle hardware, but the IGP is a separate component from the processor cores and doesn't take up any of their computing power. There is a slight overhead from the software layer (Virtu) interfacing with the hardware, but it only costs a few fps.
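    To put "only a few fps" in perspective, here's a tiny Python sketch of how a fixed per-frame cost shows up at different framerates. The 0.3 ms overhead is a purely illustrative assumption, not a measured Virtu number.

        def fps_with_overhead(base_fps, overhead_ms):
            # Add a fixed per-frame cost to the frame time and convert back to fps.
            frame_ms = 1000.0 / base_fps
            return 1000.0 / (frame_ms + overhead_ms)

        for base in (60, 100, 150):
            print("%d fps -> %.1f fps" % (base, fps_with_overhead(base, overhead_ms=0.3)))
        # A fixed overhead eats a bigger share of an already-short frame,
        # so the cost grows with framerate but stays "a few fps" either way.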

    The Virtu software, afaik, doesn't cause any microstutters; in fact, it is somewhat designed to combat them.
    buhehe wrote: »
    2500k to 3570k is a sidegrade, what's the point?

    The 3570k is built on a smaller process than the 2500k (22nm vs 32nm). This concentrates more heat, but also makes it more efficient. At the same clock speed, the 3570k gives you about 15% more performance than a 2500k. However, the 2500k can often overclock higher before hitting its thermal limit, which can negate that gain. More importantly, the 3570k comes with a better IGP (~2x the performance) than the 2500k, which makes it the logical choice for people planning to make use of it.
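    Rough arithmetic on that trade-off, taking the ~15%-per-clock figure above at face value (the clock speed is just a hypothetical overclock):

        ivy_clock = 4.5        # hypothetical 3570k overclock (GHz)
        per_clock_gain = 1.15  # the ~15%-at-the-same-clock claim above

        breakeven_sandy_clock = ivy_clock * per_clock_gain
        print("A 2500k needs about %.2f GHz to match a 3570k at %.1f GHz"
              % (breakeven_sandy_clock, ivy_clock))
        # ~5.18 GHz with the claimed 15%; a smaller real-world per-clock gain
        # is much easier to cancel out with extra Sandy Bridge overclock headroom.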
    Sandrock wrote: »
    I've tried using the Virtu feature of my mobo in the past, didn't notice much of a performance increase, and it didn't work well with lots of games. L4D2 would default to using only the integrated when I had Virtu enabled.


    The Virtu software does 2 things: Virtual Vsync and HyperFormance.

    With VVsync, it uses the processing power of your IGP to offload certain frame rendering jobs, related to vsync and vsync buffer. The end result is that it removes the artificial FPS cap for using Vsync in-game (e.g. 30fps, 60fps, 120fps, 240fps). With VVsync, you can have vsynced graphics at a higher framerate than the traditional fps caps, if your graphics card is capable of pushing that framerate. It also helps smooth out vsync when double- or triple-buffering, in my own experience. It does NOT simply combine your IGP with your discrete graphics card in SLI/Crossfire for increased FPS. I don't think it increases your base fps at all, from what I can tell. It only processes vsync.
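    For reference, the "artificial fps cap" being removed is the classic double-buffered vsync behavior, where a frame that misses the refresh deadline waits for the next vblank. Here's a simplified Python model of that (a sketch of the general idea, not how any particular driver or Virtu itself implements it):

        import math

        def vsynced_fps(render_ms, refresh_hz=60):
            # A frame is only shown on a refresh boundary, so fps snaps to 60, 30, 20...
            refresh_ms = 1000.0 / refresh_hz
            intervals_waited = math.ceil(render_ms / refresh_ms)  # wait for next vblank
            return 1000.0 / (intervals_waited * refresh_ms)

        for render_ms in (10, 17, 25, 40):
            print("%d ms render -> %.0f fps with vsync, %.0f fps without"
                  % (render_ms, vsynced_fps(render_ms), 1000.0 / render_ms))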

    When you have vsync enabled, it can create input lag in your games. This has to do with the way your card buffers and then displays frames, causing your input to be delayed relative to what is happening in-game. HyperFormance is designed to eliminate this input lag and make games more responsive for people who play with vsync. However, because of how it works, it changes the fps reading in any measuring software (in-game or otherwise). With HyperFormance enabled, your fps reading is no longer your frames-per-second; it is a measure of how responsive your game is. Afaik, both r_stats and FRAPS are affected by this. You cannot trust any fps reading when you are playing with HyperFormance enabled.
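    A rough back-of-envelope for where that lag comes from, assuming the delay is roughly the number of queued frames times the refresh interval (illustrative Python, not Virtu measurements):

        def worst_case_lag_ms(refresh_hz, buffered_frames):
            # Input applied to a frame only shows up after it works through the queue.
            refresh_ms = 1000.0 / refresh_hz
            return buffered_frames * refresh_ms

        print("double buffering @ 60Hz: ~%.0f ms" % worst_case_lag_ms(60, 2))  # ~33 ms
        print("triple buffering @ 60Hz: ~%.0f ms" % worst_case_lag_ms(60, 3))  # ~50 ms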

    Because Virtu is a software layer that tries to communicate between the IGP and the discrete graphics card, its compatibility is always going to be limited by how well particular games react to it. The Virtu software ships with tested profiles for a lot of games, and you can manually add profiles for unknown games (just like Crossfire/SLI profiles). Your results may vary.

    Using Virtu MVP with NS2 on my system does help a lot when I'm running vsync, for what it's worth. I'm still experimenting to see which vsync and graphics settings I want to run the game with in the long term.
  • DC_Darkling Join Date: 2003-07-10 Member: 18068 Members, Constellation, Squad Five Blue, Squad Five Silver
    That's what is odd to me.
    Any time 2 different chips work together to produce your graphics, there's a heightened chance of microstutter.
  • buhehe Join Date: 2012-05-15 Member: 152140 Members
    ScardyBob wrote: »
    buhehe wrote: »
    Well, my GPU (6950 2GB) quite often limits my CPU (2500k @ 4.3GHz) when I'm doing 50-80 fps.
    But when I'm down to 40 fps, I'm CPU bound.
    I get the same thing on my 2600k @ 4.7GHz + HD 6950, but only when I crank up my graphics settings (e.g. with everything on full and in combat, I'll get 'waiting for GPU' in the 10+ms range). However, our hardware setup is not that common (like putting a Ferrari engine in a Kia), so most people are almost always CPU bound.

    Yeah, I have most settings cranked up.

    Btw, I bought this setup more than 2 years ago, in February 2011... the 6950 was the 2nd best card in the AMD lineup at the time, so it was a balanced rig :)
    The problem is that CPU-wise the market has stagnated; Intel wasn't pressured enough by AMD's new CPUs, so there hasn't been a great improvement performance-wise, at least not in single-threaded performance.
  • ScardyBob Join Date: 2009-11-25 Member: 69528 Forum Admins, Forum Moderators, NS2 Playtester, Squad Five Blue, Reinforced - Shadow, WC 2013 - Shadow
    BVKnight wrote: »
    buhehe wrote: »
    2500k to 3570k is a sidegrade, what's the point?

    The 3570k is built on a smaller process than the 2500k (22nm vs 32nm). This concentrates more heat, but also makes it more efficient. At the same clock speed, the 3570k gives you about 15% more performance than a 2500k. However, the 2500k can often overclock higher before hitting its thermal limit, which can negate that gain. More importantly, the 3570k comes with a better IGP (~2x the performance) than the 2500k, which makes it the logical choice for people planning to make use of it.
    It would have made more sense to wait for Haswell, unless you're planning on delidding that 3570k.
  • rakz Join Date: 2012-10-29 Member: 164315 Members
    I think I got bad luck with my 2500k: I can only hit 4.6GHz max, and only at max temps with a high vcore.
    I'm planning to go 4.5-4.6 on Ivy so I can get at least 10 more fps. For me, 10 fps for R$700.00 (like $380; prices in BR have too many taxes, so everything is expensive) is worth it.
    I work, I get money, and I need moar fPsSSs.
  • BVKnight Join Date: 2012-02-26 Member: 147496 Members
    edited March 2013
    If you're dead set on upgrading from a 2500k, you should at least check when the latest-generation equivalent (Haswell) is going to be released. It will be newer than Ivy and more efficient, so there will be no reason not to get it instead of Ivy if it's only a few weeks or months away.
    DC_Darkling wrote: »
    That's what is odd to me.
    Any time 2 different chips work together to produce your graphics, there's a heightened chance of microstutter.
    If I understand it correctly, this doesn't usually happen because the two different chips aren't working together in producing your graphics. Your discrete video card still does all of the rendering. There isn't an alternating rendering of frames like there is with dual video cards, which can cause microstutter. The Virtu software instead uses the IGP to offload certain tasks from the discrete card, making the discrete card more efficient in certain circumstances (i.e. when using Vsync). The only stuttering that can occur results from problems in the software, which is usually patched on a per-game basis; it is not hardware-based, like with crossfire/sli microstuttering.
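    For contrast, here's a toy Python model of the SLI/Crossfire alternate-frame case: two GPUs each take ~20 ms per frame, but unless they start exactly half a frame apart, the presented frames bunch up and the intervals alternate short/long even though the average fps looks doubled. The numbers are purely illustrative.

        def afr_intervals(gpu_frame_ms=20.0, offset_ms=6.0, frames=8):
            # The ideal offset is exactly half a frame; anything else bunches frames up.
            present_times = sorted([i * gpu_frame_ms for i in range(frames)] +
                                   [i * gpu_frame_ms + offset_ms for i in range(frames)])
            return [round(b - a, 1) for a, b in zip(present_times, present_times[1:])]

        print(afr_intervals())              # [6.0, 14.0, 6.0, 14.0, ...] -> uneven pacing
        print(afr_intervals(offset_ms=10))  # perfect half-frame offset -> even 10 ms steps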

    Check out this page from Anandtech for a good overview of how Virtu works:
    http://anandtech.com/show/5728/intel-z77-panther-point-chipset-and-motherboard-preview-asrock-asus-gigabyte-msi-ecs-and-biostar/3
    Read the previous page of the review for a more detailed explanation.
  • rakz Join Date: 2012-10-29 Member: 164315 Members
    It will not be a Z77 board, btw.
  • BVKnight Join Date: 2012-02-26 Member: 147496 Members
    If it's not going to be a Z77 board, then you should check what version of the Virtu software comes on whatever board you are using. Virtu MVP runs on Z77 and has Virtual Vsync and HyperFormance; older versions/chipsets didn't have all the same features.
  • digitaljuice Join Date: 2003-01-17 Member: 12420 Members
    I tried this feature out with my MSI Z77 MPower and didn't really notice much of a change. I'm using a 7970 for graphics and already get pretty good frames @ 1200p. I ended up turning the feature off as it caused issues with other things.