Nvidia heralds Steam for Linux debut with 'double-speed' drivers

The Linux version of Valve’s Steam games download shop goes live today, chip maker Nvidia has revealed. Nvidia pointed to the “Steam gaming platform that officially opened to gamers today” while announcing a new Linux-optimised version of the R310 drivers for its GeForce graphics chips, including the new GTX 600 series. According to the …

COMMENTS

This topic is closed for new posts.
  1. Steven 1
    Thumb Up

    "Steam runs on Ubuntu 12"

Meh, it's a start :)

    1. Lars Silver badge
      Thumb Up

      I hope it will run on other distros too.

      1. Bucky 2
        Alien

        Probably not, though

        In any enumeration, it's usually safe to assume items not listed are excluded.

  2. Silverburn
    Boffin

    Graphics @ double speed

...meh. So compared to Windows, my games will be how many fps faster/smoother?

    1. Anonymous Coward
      Anonymous Coward

      Re: Graphics @ double speed

      Exactly the same if you only have a 60fps monitor and a good setup.

      Seriously, always hated it when people online are all "lololol my framerate is 212fps"

      "Yeah and your monitor is only 60 so whats the point?"

      1. Tom 7

        Re: Graphics @ double speed

Or to put it another way, a machine that won't run at 60fps under Windows may now be able to run at that speed using Linux.

        Until you run Unity on it...

      2. yossarianuk

        Re: Graphics @ double speed

I play a game based on the Quake 2 engine, Digital Paint - dplogin.com

Unless I disable Sync to VBlank the game is unplayable - yes, it sticks at 60fps, but that makes the game nearly impossible to play; it's almost as if there is a really slow lag on the controls.

As soon as I disable that option I get 400 - 1000 fps. It's not the fps itself, though; it's just not possible to play (well) unless I disable it.

      3. Anonymous Coward
        Thumb Down

        Re: Graphics @ double speed

Common misconception. Framerate in games means more than just the monitor's refresh rate. Higher FPS does indeed make a difference in gameplay, regardless of monitor - more or less depending on the game.

      4. Flawless101
        Meh

        Re: Graphics @ double speed

There can be a noticeable difference in the reaction times of a game at different levels of FPS. Just because an animation won't look smoother over 60 FPS doesn't mean there is no benefit.

        1. P. Lee
          Go

          Re: Graphics @ double speed

          > over 60 FPS doesn't mean there is no benefit

          Indeed, the fps is for one particular game with a particular resolution and particular effects turned on - it isn't the speed of the card itself.

More FPS is like buying a CPU that's faster than you need at the moment. It will handle the stuff you throw at it later, allowing the fps to drop and still be reasonable: higher-res games with more polygons being rendered and more real-time physics calculations rather than canned animations.

If we have to abuse some CS fanbois' wallets to get there, so be it - it's all for the greater good! :D

      5. Anonymous Coward
        Anonymous Coward

        Re: Graphics @ double speed

        FEAR MY CRT! I SHALL PWN YOU ALL!

      6. Tyrion
        Linux

        Re: Graphics @ double speed

It's only the same (60fps) if you have vsync on, but that's beside the point. A higher fps on GNU/Linux indicates that the pipeline is more efficient, and consequently the stresses on the hardware are lower, even if it's vsync'd to the screen's refresh rate. Personally, I'd prefer to increase the longevity of my computer as much as possible, so bring on the fps ;)

    2. yossarianuk
      Linux

      Re: Graphics @ double speed

      Valve already stated their engine was faster in Linux than Windows

      http://blogs.valvesoftware.com/linux/faster-zombies/

But speed really depends on which desktop you're using and whether or not it's composited.

E.g. games in E17 or LXDE are about 50% faster than in Unity/KDE (default settings).

      I imagine all the driver is really doing is just forcing the desktop to use the sensible settings in the first place.

Here is a previous post I made about this very subject (for Nvidia users):

      ---------------------------------------------------------------

Right now, in most Linux distros 'out of the box', Nvidia users have to do two things, or games will be slow, have a lag that makes them unplayable, or show graphical anomalies.

I really can't tell you enough how much of a difference this makes to gaming... Without doing these options many games are unplayable on Linux.

      (1) Enable 'unredirect fullscreen windows' in CCSM (compiz setting manager) / enable 'suspend desktop effects on fullscreen apps' for KDE

      - Without this various games just don't display correctly.

      Here is one good example..

The Unigine Heaven 3.0 benchmark

      http://unigine.com/products/heaven/download/

If you run this out of the box in Unity, it's not only half the speed it should be - the graphics are not actually correct either.

However, if I enable 'unredirect fullscreen windows' then the speed is normal.

(2) disable 'sync to vblank' (OpenGL settings) with nvidia-settings (this is really important, or all games appear laggy)

- Now some people moan that doing this creates tearing... All I can say is, since driver version 302.17, games are **not playable** unless I disable that option - I don't mean lack of FPS; it's like all the controls are laggy, FPSes are hard to aim, etc.

      Desktop Linux needs to make these options **the default** to vastly improve Linux as a gaming platform for everybody.

Basically I urge **any** OpenGL gamer that plays fullscreen games to try the options (you can always switch back), as it takes Linux from being crap for gaming to better than Windows 7.

I bet no one running an Nvidia card will say these changes haven't improved gaming on their system.

      ---------------------------------------------------------------

      1. Anonymous Coward
        Anonymous Coward

        Re: Graphics @ double speed

As a relative Linux noob I have to ask: how simple is it to replace the Unity interface on Ubuntu with a lightweight alternative such as LXDE?

I ask simply because I wanted to attempt making a pure gaming setup out of Ubuntu (if the games I have on Steam work): strip out anything I don't need, and just have a basic window manager and Steam with the bare essentials.

        Is something like that possible? And if so is it fairly simple to do? Or is it one of those "wait for the experts to make a stripped down Steam distro, it'd probably be quicker and less painful for you" situations.

        1. Anonymous Coward
          Anonymous Coward

          Re: Graphics @ double speed

          As I understand it, on Ubuntu, go to a Terminal, type "sudo apt-get install lxde", enter your account password and wait for it to finish installing.

          Then, log off, and at the login screen choose "LXDE" as your session (I don't know where on the screen the option for this is off the top of my head). It should log you into an LXDE session instead of Unity, and it should remember to do that next time you log in as well.

          There may be the odd other tweak that you can do to make LXDE more user-friendly, or add some graphical admin tools for it and stuff, but basically it's as simple as that.

          Anon because I'm at work.

        2. Uncle Slacky Silver badge
          Linux

          Re: Graphics @ double speed

You could just install Lubuntu instead, or if you want to keep the full-fat Unity option, install standard Ubuntu then "sudo apt-get install lubuntu-desktop" and you've got (nicely-configured) LXDE as per the previous reply.

          Personally I use Bodhi, which uses Enlightenment (E17) for its desktop, and it's even lighter than LXDE. It still retains the Ubuntu base underneath, however, so (presumably) Steam will still work OK on it.

          1. DiBosco
            Unhappy

            Re: Graphics @ double speed

I don't think so - I thought I saw on the Valve site that it only works on Ubuntu running Unity.

        3. YellowApple

          Re: Graphics @ double speed

          You can actually get Ubuntu with LXDE pre-installed *instead* of Unity (it's called "Lubuntu"). Or, if you somehow do like Unity, you can install it via "sudo apt-get install lubuntu-desktop" (a bit cleaner than just installing LXDE).

          Lubuntu comes with Chromium (the open-source project behind Chrome), which is a plus for me.

        4. Euripides Pants

          Re: noob question

          You could download Lubuntu or Xubuntu.

        5. Tyrion
          Linux

          Re: Graphics @ double speed

          It's probably easier to just download one of the many Ubuntu variants with it built-in if you're a newbie:

          http://wiki.lxde.org/en/Ubuntu

      2. Fibbles

        Re: Graphics @ double speed

As noted above, most Linux distros are not currently set up for gaming. I'm currently running Xubuntu, which has low system requirements and should be ideal for gaming. Unfortunately the default settings make it nearly useless.

        I have no idea what's going on with the XFCE compositor (xfwm4 I think?) but it can reduce the fps in games by 50% and creates massive amounts of screen tearing. It's enabled by default for a few desktop effects (window shadows etc.) and it really isn't obvious to a new user that this seemingly unrelated feature could be causing problems with their games.

        You can turn it off by going to Settings Manager > Window Manager Tweaks > Compositor.

I'm at a loss, though, as to what the hell the compositor is actually doing to cause such a degradation in performance. I run my games fullscreen; I can't even see the desktop whilst playing, so the compositor shouldn't be doing anything at that point.

        1. Tom 7

          Re: Graphics @ double speed

In my experience, the three machines I have with Ubuntu and Nvidia that have been upgraded past 11.10 will no longer run OpenGL programs anywhere near as well (if at all), even if I move over to LXDE/XFCE or whatever.

  3. The BigYin

    Question

    If I have an LCD screen which claims to be running at 60Hz, is there any benefit to my GPU firing out data at over 60fps? Won't I just see tearing? Or do LCDs not work like that?

    1. Lee Dowling Silver badge

      Re: Question

      Refresh rate is the bottleneck that stops anything "updated" behind the card showing on the screen. So if your refresh rate is 60, you could have a supercomputer on the back end and will never see any difference.

      People like to argue, like the whole "vinyl sounds better" or "oxygen-free gold cables" or whatever crowd, but that's the basic physics of it.

And, even then, 30fps is really not that different from 60fps to your eyes. Most movies and big-name videogames *weren't* 60fps for years and nobody complained. Nowadays we have faster games, larger TVs, better contrast etc., so it's a little more noticeable, but we also have games that routinely use triple-buffering, vsync, etc. to compensate.

      Basically, if you can't *tell* me the fps of a game within 10fps without having to actually bring up the display option in-game to show it, then you would never have noticed the difference anyway.

      But the point is, a game that gets 212fps on a monster of a machine will get 60fps on a machine that was never capable of it originally. That's the point of the improvement - to suck in gamers who care about fps (even if they can't tell the difference), and those people on lower hardware (like the various Linux-based consoles that are in the works). If you can get games to run faster on Linux, you can save money on hardware by using Linux and those games instead, and thus boost Linux's credentials.

      Actually, though, the only difference is having a tight integration between driver developer, operating system programmer and games programmer. If that tight integration existed on ANY combination, you could get improvements, and that's always been true. But it's no bad thing for Linux gaming to get the boost of not only having Steam available, but getting coding attention on its drivers, and convincing the gamer market that they run better on Linux. Game. Set. Match.

      1. This post has been deleted by its author

      2. Fibbles

        Re: Question

        There are a surprisingly high number of people in this thread talking out of their arse about how high frame rates are pointless (and getting upvoted for it...?!). Higher frame rates generally mean smoother control input, less buggy AI handling, smoother physics etc. It's not all about what your eye can see.

        1. The BigYin

          Re: Question

I can't see why a simple question got downvoted... but the explanation that the game engine pays attention to the frames coming through, and thus a higher framerate (even above what the monitor can actually display) may give better/smoother response, is interesting.

          Are there any authoritative sources on this?

          1. Fibbles

            Re: Question

            If you're talking about a scientific study, probably not. The internet is lettered with articles on the subject though, some good, some bad. A simple google of terms such as 'higher fps smoother controls' will give you thousands to read, though I'd stick to more respected sites like tomshardware and such.

As a quick example though; imagine you're playing a first person shooter. You have it installed on two different rigs, one capable of a constant 60 frames per second, the other a constant 150 frames per second*. The monitors attached to these computers have a refresh rate of 60Hz, so you're only ever going to 'see' 60 frames per second regardless of which rig you use. However, on the rig that can run at a max of 60 frames per second, your control input as well as things like physics calculations are only updated 60 times per second; therefore the rig that can run at 150fps is sampling your control input far more often, making it more accurate, which gives it the feeling of being less laggy (see the toy sketch below for rough numbers).

            *This is unrealistic since frames per second is an average and 60fps could actually mean the equivalent of 70fps in low complexity scenes and 15fps in high complexity scenes. Generally your brain will perceive motion as fluid if you're achieving 25fps or greater, however motion will stop appearing fluid if your brain perceives a drop in frames per second (which it will since fps is not constant). What's more, a drop from 45fps to 30fps is considerably more noticeable than a drop from 200fps to 100fps.
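To put toy numbers on that sampling point (a made-up illustration, not any real engine's code - report() and the loop shape in the comment are just the classic coupled-loop pattern):

#include <stdio.h>

/* In a coupled game loop the engine does, in effect:
       while (running) { poll_input(); update(); render(); }
   i.e. one input read per rendered frame, so the frame rate
   is also the input sampling rate. */
static void report(int fps)
{
    printf("at %3d fps, controls are sampled every %.1f ms\n",
           fps, 1000.0 / fps);
}

int main(void)
{
    report(60);   /* rig capped at the monitor's 60Hz: every ~16.7 ms */
    report(150);  /* faster rig: every ~6.7 ms, even on a 60Hz panel  */
    return 0;
}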

            1. Fibbles

              ^littered

              What I'd give for an edit button...

            2. Bucky 2
              Pint

              Re: Question

              What I'm hearing here seems similar to the DPI discussions we used to have with print designers when we wanted an image for the web.

              Saying that you have 5000 fps (or some such) is a lot like saying that you have 5000 dpi, except the multiplier is a bit more abstract, and comes in terms of complexity of the image, rather than the more direct math of inches of image, like we have with DPI.

              It's all good. Sometimes it IS a waste to have too many DPI, and sometimes it IS a waste to run at a high frame rate.

    2. Flocke Kroes Silver badge

Triple buffering

Imagine your graphics card is sending frame buffer 1 to the display. While it is doing that, it starts drawing the next frame on buffer 2. If your card can average 600 frames per second, it will complete that task well before frame 1 has been sent. It could sit idle and save some electricity, or it could start drawing a new frame in buffer 3. When the card has finished drawing frame 3, it still has plenty of time, and no real use for the data in frame buffer 2. It can start drawing another frame there. Eventually, all the data from buffer 1 will reach the monitor. When that happens, the graphics card can start sending data from frame buffer 2 or 3 - whichever is complete. If buffer 2 was complete, the card finishes drawing in buffer 3, then starts drawing in 1, then 3, and so on until buffer 2 has been sent to the monitor. (There's a rough code sketch of this rotation after the list below.)

      Triple buffering has many wonderful advantages:

      *) It provides extra profits for electricity companies.

*) It makes your fan spin so loud that the neighbours can hear it over the noise of exploding zombies.

*) You can boast about how many frames per second your £5000 graphics card does.

      *) The gaming engine looks at your controls once per graphics card frame rather than once per monitor frame.

      That last one can make a real difference to how laggy a game feels.
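In code terms, the rotation described above comes out roughly like this (a toy model using buffer indices and printf - illustrative only, no real driver works this way, and the "vblank every third frame" timing is faked):

#include <stdio.h>

int main(void)
{
    int scanout = 0;  /* buffer the monitor is currently reading   */
    int drawing = 1;  /* buffer the card is currently drawing into */
    int spare   = 2;  /* holds the newest completed frame          */

    for (int frame = 0; frame < 9; frame++) {
        /* the card finishes a frame: it becomes the newest candidate,
           and the old candidate is recycled as the next draw target */
        int finished = drawing;
        drawing = spare;
        spare   = finished;

        /* pretend a vblank arrives every third finished frame,
           i.e. the card renders ~3x faster than the refresh rate */
        if (frame % 3 == 2) {
            int old = scanout;
            scanout = spare;  /* send the newest complete frame out */
            spare   = old;    /* the freed buffer rejoins the pool  */
            printf("vblank: scanning out buffer %d\n", scanout);
        }
    }
    return 0;
}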

      1. ArmanX

        Re: Triple buffering

That last one is the real reason for higher framerates: if your graphics card does 60 fps, same as your monitor, then the graphics will look fine, as long as it can keep up. However, many games alternate between drawing a frame and checking controls, doing math, and generally keeping track of everything. If your computer is working hard just getting the fps up to 60, it's not taking as much time as it needs for number crunching and control checking, and the game will feel laggy, often to the point of being unplayable. In my experience, if the fps is somewhere around 2-3 times what your monitor can display (that is, between 100 and 200 fps), games are a lot smoother - not because the graphics are lagging any less, but because all the stuff that happens between frames has enough processor time to get the job done.

As an aside, I tend to play most games in a new X session; it has the benefit of always being the right screen size, it disables graphics processing on my desktop, and it allows swapping between the game and the desktop (alt+F7 for the desktop, alt+F8 for the game). Plus, if the game crashes, I can always kill it from my desktop. Very handy.

      2. Anonymous Coward
        Anonymous Coward

Re: Triple buffering

Dunno - we did double buffering at uni, only not triple. But there are quite a few methods that allow you to handle the buffering separately from the control input. I can't remember exactly what it was, but something along the lines of a second thread.

Thread 1: constantly running, taking commands and affecting gameplay.

Thread 2: handling a variable framerate.

That's the stupidly simplistic way of putting it; what I describe as "thread 2" was actually several tick timers running using a semaphore. It'd increment the tick timer and then put itself to rest using the semaphore until the next frame was ready to post, then while the ticks were > 0 it'd do all the logic. I honestly can't remember how I did it, though. But basically it used a tick timer to work out if the game was ready to post the next frame. Until the next frame was ready it just looped around handling the user input logic. Once the frame was ready it broke out of the game logic, drew the frame using double buffering, and then went back into the logic loop until the next frame was ready. A simple way of pseudocoding it (THIS IS NOT ACCURATE PSEUDOCODE):

while (gameRunning)
{
    while (!nextFrameReady)
    {
        doLogic();          /* handle input and update gameplay      */
        checkFrameReady();  /* update tick timers; sets nextFrameReady */
    }
    drawFrame();            /* draw via the double buffer */
    nextFrameReady = false; /* back to the logic loop until the next tick */
}

Double buffering was handled in drawFrame(). checkFrameReady() would've handled calculating the tick timers, basically checking if the loop was cycling faster than the tick counter was incrementing, or something.

        Bah I wish I remembered this stuff more clearly.

  4. RachelG

Not a gamer - but I always presumed the game-players' fps figures were with respect to a benchmark example of gameplay of some sort; and thus that if it could play *that* at X hundreds of fps, it could play newer stuff with far more detail at the monitor's actual refresh rate with ease.

    Is that actually so, or am I making the fallacy of assuming a logical explanation when it's probably as arbitrary and variable a unit as women's dress sizes?

    1. Anonymous Coward
      Anonymous Coward

      In a lot of cases yes having an FPS above 60 would probably mean you could just up the graphic quality. But in the case I mentioned earlier it was the idiots who already claim to have everything maxed on their quad GTX695 setup that will probably be replaced in a heartbeat when the GTX795 comes out.

      Honestly though I'm happy that Steam has moved to linux, I just hope that the games I enjoy move over with it alright. I would hope they'd have a compatibility list for the games you own. "You have game X in your library, this game has been ported to linux" or "You have game Y in your library, this game can run on most linux setups" etc etc.

  5. TJ1
    WTF?

    The statement seems to imply the drivers were previously hobbled

    Trying to understand what Nvidia mean here. Performance gains are usually incremental when gained through driver optimisation. Do they mean they found some hitherto unknown bottlenecks in the Linux drivers or have they simply removed an artificial cap in the drivers that prevented the Windows drivers looking bad on the same hardware?

    "According to the chip maker, the drivers “double the performance and dramatically reduce game loading times” of Linux games - at least if a test comparing the new code with version 304.51 while running Valve’s Left 4 Dead 2 beta is anything to go by."

    1. Lee Dowling Silver badge

      Re: The statement seems to imply the drivers were previously hobbled

      If you know that operation X is slower on Linux than on Windows (or vice versa), then you can optimise by taking account of that.

      It doesn't mean Linux or Windows is any slower than the other, it just means that they weren't optimising to all platforms when they wrote not only the drivers but also the games.

      Say you shove a thousand models into memory and then try to use them. Maybe the Linux architecture / driver prefers them to be byte-aligned, or from a certain area of memory, or in a certain format, or below a certain size and if that's not met it kicks in some routine to put them into an optimal arrangement that can take some time (but is invisible to the user). But other platforms might have different requirements. Thus designing, programming and testing games only on Windows means that you optimise the paths you can SEE are the fastest. And then when you move that code to Linux, it slows to a crawl despite being THE SAME CODE.

This is more about choosing the right optimisations / settings that work best for Linux rather than Windows (e.g. Windows might well require you to jump through hoops - which your code does - whereas Linux is quite happy to give you complete 64-bit memory access and not worry about how you align it, etc.). And these games were all made for Windows originally, so they squeeze the most out of the Windows routines. This is just a matter of finding out where Linux can do better by *NOT* pretending to be the same as Windows and telling the game (code-wise): "Don't be silly, I don't need you to babysit me as if I were Windows, just give me the damn data".

    2. yossarianuk

      Re: The statement seems to imply the drivers were previously hobbled

I'm almost 100% sure that the drivers will just enable the 'correct' settings for the desktop (which were previously cutting 50% of the fps for the average user; see previous post).

i.e. the drivers were not hobbled; the insane default desktop settings in KDE, GNOME 3 and Unity were the problem - thanks to bad ATI drivers, the sensible settings are not the default.

i.e. - enable 'unredirect fullscreen windows'

I'm pretty sure people who used the 'correct' settings anyway will notice no speed-up (as we were already running 50% faster..)

    3. JEDIDIAH
      Linux

      Re: The statement seems to imply the drivers were previously hobbled

      Valve just became the QA department for Nvidia. They are pushing those drivers harder than anyone has ever pushed them and they probably can submit very useful bug reports.

Yeah, there were probably a few bugs to squash in the Nvidia BLOB driver for Linux.

  6. Anonymous Coward
    Anonymous Coward

    “Steam gaming platform that officially opened to gamers today”

    Yes, but I'm assuming that date is in Valve time and so is actually going to be next quarter ...

  7. Alex Walsh

    FPS- missing the point

If something runs at 200fps, that's good, because when things get very busy onscreen it'll drop. If it drops 75%, that's still 50fps. Try waiting for the screen to get busy when you're only managing 50fps to start with... it'll turn into a slide show!

  8. Bakunin
    Linux

    Double the performance

    That's great Nvidia. But any chance you could get something as simple as KMS working with your drivers?

  9. Cholo

    From the download page, the only performance-related change they mention is:

    "Fixed a performance issue with recent Linux kernels when allocating and freeing system memory."

    I wonder if this was enough to double performance?

    http://www.nvidia.co.uk/object/linux-display-ia32-304.64-driver-uk.html

    1. This post has been deleted by its author

    2. Fibbles

I had originally thought those were old drivers, but it turns out you are correct: 304.64 are the latest certified drivers, since they were released today. I'm not 100% sure how Nvidia's version numbering system works, to be honest, but I'm beginning to suspect this Reg article is about the 310.14 beta drivers, which were released on the 15th of October - unless their changes have already been incorporated into 304.64.

      From 310.14's release notes:

      - Implemented workarounds for two Adobe Flash bugs by applying libvdpau commit ca9e637c61e80145f0625a590c91429db67d0a40 to the version of libvdpau shipped with the NVIDIA driver.

      - Fixed an issue which affected the performance of moving windows of VDPAU applications when run in some composite managers.

      - Added unofficial GLX protocol support (i.e., for GLX indirect rendering) for the GL_ARB_pixel_buffer_object OpenGL extension.

      - Added support for HDMI 3D Stereo with Quadro Kepler and later GPUs. See the documentation for the "Stereo" X configuration option in the README for details.

      - Added experimental support for OpenGL threaded optimizations, available through the __GL_THREADED_OPTIMIZATIONS environment variable. For more information, please refer to the "Threaded Optimizations" section in chapter "Specifying OpenGL Environment Variable Settings" of the README.

      - Improved performance and responsiveness of windowed OpenGL applications running inside a Unity session.

      - Added support for OpenGL 4.3.

      - Added support for the "Backlight" RandR output property for configuring the brightness of some notebook internal panels.

      - Fixed a bug that prevented the Ubuntu Unity launcher panel from unhiding: https://bugs.launchpad.net/unity/+bug/1057000

      - Fixed a bug that caused nvidia-installer to sometimes attempt to write a log file in a nonexistent directory.

      - Fixed a bug that caused incorrect input transformation after resizing an NVIDIA X screen with xserver ABI 12 (xorg-server 1.12) or newer.

      - Fixed a bug that caused GLX to leak memory when Xinerama is enabled.

  10. Anonymous Coward
    Anonymous Coward

    Here's how to think of frame rates

    OK, here's how to think of frame rates.

    The speed limit on the highway is 60 MPH (your monitor is 60 Hz).

    Assume strict law enforcement (sync to vertical refresh on).

    Consider a Yugo (low powered system) and a Jaguar (high powered system).

    On flat ground and no headwind (simple scene, minimal polygons and textures) both are the same speed - 60 MPH.

    Now, we hit the Rockies, and get up to 10000 feet, on a 15% grade (complex scene with lots of polygons and textures). The Yugo is making 20 MPH, the Jag is still running 60.

    Now, what if you don't happen to live by the Rockies, but you want to compare the two cars? You find a nearby stretch of road with no speed limit or no enforcement (sync to refresh OFF), and you see how fast you can go. The Yugo goes 60, the Jag goes 160.

    1. AceRimmer
      Headmaster

      Yugo?

      only 40 examples left on Britain's roads!

    2. Anonymous Coward
      Anonymous Coward

      Re: Here's how to think of frame rates

      No one has a Yugo

      1. JEDIDIAH
        Linux

        Re: Here's how to think of frame rates

        > No one has a Yugo

        Anyone that's stuck with an Intel GPU has the gaming equivalent of a Yugo.

    3. YellowApple

      Re: Here's how to think of frame rates

      Nice analogy.

  11. Zmodem

Have you tried militia?

    http://www.moddb.com/mods/q43a

  12. Anonymous Coward
    Anonymous Coward

    The endless tech drivel

    Buy a Mac and be done with it.

    1. Zmodem

      Re: The endless tech drivel

Every time you're in front of a Mac it's like you're face to face with a massive cock with a sweaty sack.

      1. Sorry that handle is already taken. Silver badge

        Re: Zmodem

        To be fair, 50% of the population might enjoy such a thing.

        1. Zmodem

          Re: Zmodem

Maybe, but gamers enjoy their forum troll pictures and overclocked 3DMark scores.

    2. TheRealRoland

      Re: The endless tech drivel

I'm sure Martha Stewart's website has the latest on Christmas table decoration. Sounds like a better match for you, somehow.

      (Or, if you're commenting from the other side of the pond, I'm sure Delia Smith has some new ideas on her website)

    3. Bush_rat
      Facepalm

      Re: The endless tech drivel

Sure, I'll spend $2500 on a top-of-the-line iMac, only to have it rendered useless and unupgradeable four years later. Don't believe me? Look at the 2008 iMacs; mine is barely usable.

      1. Trygve
        Unhappy

        @Bush_rat

To be fair to the fruit loop, last time I tried upgrading a PC which had been state-of-the-art four years previously, I was forced to replace the memory, CPU, motherboard, graphics card and power supply for compatibility reasons. The hard disk was so slow and old it got repurposed as a dump drive, tertiary to the new SSD and new terabyte drive, and I got a new monitor to make the most of the better graphics capability. Then I got a nicer case to put it all in, since it was only another £40.

        All in all I managed to re-use my power cord, network cable, DVD-RW, mouse and keyboard, plus get some more mileage out of a hard drive.

        1. ArmanX

          Re: @Trygve

          I've had a similar experience, upgrading a PC - but the difference is cost, flat out. I upgraded motherboard, RAM, CPU, heatsink, and power supply for just under $400 - kept the case, hard drive, optical drive, and graphics card (I'd upgraded it fairly recently). In fact, the last few upgrades I've done have come out right around $400 - about one upgrade every three or four years.

          The equivalent fruit-based computer (a Mac Pro) would cost $3573, according to the selection guide. That means I saved more than $3000 just by not owning a Mac. I could have gotten a computer and a car, instead of just a computer.

          For added fun, take a look at the Mac Pro "Configure" page - $100 for a CD/DVD drive (normally $20)? $1000 for a 512GB SSD instead of $400? At those prices, I'm sticking with my PC...

  13. Anonymous Coward
    WTF?

    Framerate

    If properly designed, the physics world clock rate should be independent of framerate, and framerate should be tied to vsync.

    i.e. Take input all the time, process the physics as best your time slot allows, then when you get an event from the monitor saying it is about to draw another frame, flip the buffers and render another frame to the back buffer, then go back to computing the physics. You could slightly improve this technique by keeping track of how long it takes to render a frame (interval X), then wait until vsync-X to start your render.

    The point being, you could use all that time you spent rendering dropped frames to work on physics, and you'd have a better overall user experience.
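Sketched out, that decoupling looks something like this - a toy, self-contained version where the real input/physics/render calls are left as comments, and the fixed 240Hz physics step and the smoothed render-cost estimate are my own choices rather than anything from a real engine:

#include <time.h>

#define REFRESH_HZ 60.0
#define PHYSICS_DT (1.0 / 240.0)  /* physics clock, independent of fps */

static double now_sec(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec * 1e-9;
}

int main(void)
{
    double next_vsync  = now_sec() + 1.0 / REFRESH_HZ;
    double render_cost = 0.002;   /* running estimate of interval X */
    double accum = 0.0, prev = now_sec();

    for (int frames = 0; frames < 600; ) {   /* run for ~10 seconds */
        double t = now_sec();
        accum += t - prev;
        prev = t;

        /* step the world at a fixed rate, however fast we render */
        while (accum >= PHYSICS_DT) {
            /* poll_input(); step_physics(PHYSICS_DT); */
            accum -= PHYSICS_DT;
        }

        /* start rendering at (vsync - X) so the frame is as fresh
           as possible when the buffers flip */
        if (t >= next_vsync - render_cost) {
            double r0 = now_sec();
            /* render_to_back_buffer(); flip_buffers(); */
            render_cost = 0.9 * render_cost + 0.1 * (now_sec() - r0);
            next_vsync += 1.0 / REFRESH_HZ;
            frames++;
        }
    }
    return 0;
}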

    I have no idea whether existing games are optimized along these lines, so maybe FPS does matter, because they aren't programmed right.

    1. Anonymous Coward
      Anonymous Coward

      Re: Framerate

      The engine has no way of knowing how complex an up-coming scene is going to be. Therefore it has no idea how long a frame will take to render until it starts rendering it. This means each frame has to be rendered ASAP rather than being scheduled. The engine then sends the most recently rendered frame to the monitor when it is capable of displaying a new frame (since it's only refreshing 60 times a second), dropping any older frames.

      You can enable vsync which means only 60 frames will be rendered per second in lock-step with the monitor but this means by the time each frame is displayed the player input that it is representing is already 16.6ms old. This isn't so noticeable in things like real time strategy games but in first person shooters it makes the controls feel sluggish.

      1. Anonymous Coward
        Thumb Up

        Re: Framerate

        Thanks AC, makes sense.

        Can triple buffering achieve the best result then? e.g: the back buffer is always "dirty" (being painted), when the paint is complete it swaps to the middle buffer, then when vsync hits, it gets swapped to front?

        This way frames could be rendered as often as possible, but tearing could still be eliminated. Is it only with double buffering that the whole process has to slow down to 60fps? It seems like if you draw to the monitor more than 60fps, the result would be a not-very-useful situation where different vertical slices of several different frames are displayed to the user. Does it help play-ability to have your head being 15ms old, your torso being 10ms old, and your feet being 5ms old?
