Input Delay and You

Discussion in 'Overwatch' started by Nakid, Nov 21, 2016.

  1. Nakid

    Nakid Intrepidus Dux
    Unreal Officer Streamer

    Joined:
    Feb 14, 2016
    Unreal Credits:
    5,612
    There are many factors that can affect your gameplay. Input delay is a big one, though it mostly applies to First Person Shooters, where fast reactions are the difference between life and death. The lower your input delay, the better your competitive edge. There are many facets to the delay between an action happening and your reaction, and some of them we can influence. I'll use Overwatch as my example, though this could easily transfer to other games such as Counter-Strike.

    Below I explain many of the facets of different delays, though feel free to skip to the Improving your Delay section.

    Explanation
    So what kind of delay are we talking about? Below is a list of the steps that are taken when you go to shoot an enemy.
    1. The enemy appears on screen
    2. Light reflects off the enemy and into the retinas of your eyes
    3. Your brain receives this image and recognises the enemy contained in it
    4. You move your cursor over the enemy
    5. You click to fire your weapon
    6. Your action is sent to the server
    These steps can be distilled into different input delays.
    1. Server/Client delay and Monitor Response time/refresh rate
    2. Light hitting your eyes (Speed of Light)
    3. Image Recognition delay
    4. Reaction time, monitor response time/refresh rate and accuracy
    5. Reaction time
    6. Server/Client delay
    I'll endeavour to explain these and how you can improve on each of them in detail.

    1. Light and Image Recognition
    To put it in layman's terms, light enters your eye, the image is then filtered, sent to your optic nerve (which transfers information to the brain) and then onward to your brain's visual cortex (which processes the images you see). You can read a more detailed explanation of how the eye works here, though I have provided an example of this process if we were to see Genji on screen and wanted to shoot him (no one likes Genji!).

    eyeballGenji.jpg

    So now we have received the image in our brain. We can't really improve the Speed of Light right now (maybe when we have chips inserted in our brains to play games on?) - but can we improve our image recognition time? According to a 1980 study involving participants hitting a button when they see two light bulbs line up, the average response time of a human being is 180ms (0.18 seconds). The linked article also talks about a 1994 study which explains that by the time the brain has recognised an image, the object is no longer in that space in reality and our image is out of date by up to 100ms. The study then argues that the brain compensates for this by adjusting our out of date image with an updated one which dramatically reduces the delay due to our ability to recognise images.

    According to HumanBenchmark.com, the median response time for a human is 266ms (0.266 seconds) and the average is 277ms. This is a good representation of time taken for the entire process, but not necessarily focusing on human image recognition time due to the other delays I will explain below.

    The only way to improve this recognition is through repetition. We need to improve our ability to anticipate movement in the game (in our case, Genji flying through the air at you), which can only be done through experience.

    Summary: Delay of ~180-240ms, can be improved by playing more and anticipating movement.

    2. Server/Client Delay
    In any online game you will have delay between you, the server and the other players. This varies with the quality of each connection and the distance between you <-> server <-> other player. Assuming a delay of 40ms between you/the other player and the server, there would be an 80ms delay between that player moving and you seeing them move.
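    The arithmetic can be sketched quickly (the 40ms figures are just example values, not measurements):

```python
# Player-to-player delay: the other player's action travels to the server,
# then on to you. The one-way latencies below are example values only.
def player_to_player_delay_ms(them_to_server_ms, server_to_you_ms):
    return them_to_server_ms + server_to_you_ms

print(player_to_player_delay_ms(40, 40))  # 80
```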

    In Overwatch we have a feature (some debate this) called Favour the Shooter (see link for my abridged explanation) which uses past information to try and reduce the feeling of this delay - this leads to discrepancies between what each player sees, but removes the necessity to lead your shots to account for delay.

    pharah.jpg

    The only way to improve this is by getting a better internet connection and living closer to the server. You then have to account for what kind of connection the OTHER person has. I generally have a delay of about 45ms to the Blizzard Sydney servers; I have an ADSL 1 connection and live out of town. An NBN connection can easily achieve single-digit pings. Most Australian servers reside in Sydney, as that is where the internet pipeline comes into the east coast of Australia; this also provides the best route for overseas connections.

    Summary: Delay of ~40ms (YMMV) between you and the server, can be improved by getting better internet and moving closer to server hubs.

    3. Monitor Response Time and Refresh Rate
    There are 2 factors which determine the quality of a monitor and how good it will be for gaming: Response Time (represented in ms) and Refresh Rate (represented in Hz - hertz).

    Refresh rate represents how many frames are displayed on your screen every second. Most monitors have a 60Hz refresh rate, though good PC monitors can go up to 120/144Hz. This means there is a 16.67ms/8.33ms delay between each frame on a 60/120Hz monitor respectively.

    Refresh-rate-comparison.png
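    The frame intervals above are just 1000ms divided by the refresh rate; a quick sketch:

```python
# Time between frames for a given refresh rate.
def frame_interval_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz}Hz -> {frame_interval_ms(hz):.2f}ms between frames")
# 60Hz -> 16.67ms, 120Hz -> 8.33ms, 144Hz -> 6.94ms
```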

    As long as our graphics card is churning out frames, the monitor will put the latest frame up on screen. If a new frame isn't ready, the last rendered frame is displayed again. This is why an FPS (Frames Per Second) of 120+ feels so much better than an FPS of 20, and why everything feels really jerky when you look around at a decent clip on a low FPS.

    We've established that the refresh rate tells us how often we have a frame rendered, but what is the response time for? Response time is how quickly each pixel changes from one colour to another. This is measured on a sliding scale of colour intensity from black to white (usually referred to as 0-100% grey). The diagram below shows what occurs on 2 different 60Hz monitors, one with an 8ms response time and another with 4ms. This means the monitor takes 1/2 and 1/4 of a single frame respectively to get the colours into the correct positions.

    Response-time-comparison.png

    Unfortunately a lot of monitor manufacturers misreport their monitors' response times (or they cherry-pick the best values). DisplayLag.com has a list of monitors and has done consistent testing across all the models on their list; you can search for your model to get your actual response time. Blur Busters does a heap of testing on monitors and records all their tests. You can test how your monitor performs by checking out the UFO Test. Below are some of the tests performed by them. You can see the colour bleed (which appears as a blur) on the lower Hz monitors. Obviously the faster the UFO is travelling, the more blur there will be.

    Pursuit-photographs.png

    There are a few technologies emerging for LCD panels which improve this blur, one of them being LightBoost. LightBoost is referred to by a few different names depending on brand, but basically involves strobing the backlight of the monitor to get a more CRT-like image (it switches between showing the image and black - removing the bleeding of colours between). You can read more about it here. Most of these features are only available on more recent high end monitors. You can see an example of this in action on our UFOs here.

    In many games you may have seen an option called VSync (Vertical Synchronisation). This option synchronises the number of frames the GPU renders with the number of frames displayed on the screen (i.e. the GPU renders 60 FPS and only 60 FPS for a 60Hz monitor - it creates a cap based on your monitor's refresh rate). This is useful if you are seeing screen tears, which happen when the monitor shows you 2 frames at the same time (it replaces each line of pixels in a top-to-bottom manner, and the GPU is sending more frames than the monitor can handle). VSync eliminates this problem because the monitor waits to display a frame until it is fully ready on the GPU. Unfortunately there is a big downside: since the frame is being delayed, all your inputs are also delayed from showing on the screen. Another downside is that if your FPS drops below the "cap" (60 in our example), you'll notice a sudden jerk on your screen, since the monitor had to wait even longer than usual for a fully rendered frame.

    maxresdefault.jpg
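    A minimal sketch of that waiting behaviour, assuming the monitor simply shows a finished frame at the next refresh boundary (real driver behaviour is more involved):

```python
import math

# With VSync on, a finished frame waits for the next refresh boundary
# before it is displayed, adding delay on top of the render time.
def vsync_display_time_ms(frame_ready_ms, refresh_hz):
    interval = 1000.0 / refresh_hz
    return math.ceil(frame_ready_ms / interval) * interval

# A frame that finishes 1ms after a 60Hz refresh waits ~15.7ms extra:
print(vsync_display_time_ms(17.67, 60))  # ~33.33
```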

    NVidia released a technology called G-Sync, supported from their GTX 600 range of GPUs, which lets the GPU take control of this process rather than the monitor. Unfortunately your monitor needs to be equipped with NVidia's chip to provide this functionality. This eliminates the potential staggered frames of VSync and provides a smooth viewing experience. You can see the difference in the below video:



    Summary: There are many moving parts in a monitor. More expensive monitors provide a better experience, reducing the time between frames and improving image clarity. Both of these help your eyes recognise a target and let you use your input device to point your crosshair at them. It is harder to aim at a smudged image!

    Delay @
    60Hz - up to ~27ms = 16.67ms (Refresh) + 1-10+ms (Response - YMMV)
    120Hz - up to ~18ms = 8.33ms + 1-10+ms
    144Hz - up to ~17ms = 6.94ms + 1-10+ms
    NB: The refresh rate represents the input delay, while response time merely adds blur; depending on severity it will still affect your image recognition. The summed figures above represent the time it would take to see the final image; response time will vary depending on the colour range between Frame 1 and Frame 2 and on your individual monitor.
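    A quick check of the sums in the table above, taking the worst-case 10ms response time:

```python
# Worst-case time to see a fully settled frame:
# one full refresh interval plus the pixel response time.
def worst_case_monitor_delay_ms(refresh_hz, response_ms):
    return 1000.0 / refresh_hz + response_ms

for hz in (60, 120, 144):
    print(f"{hz}Hz + 10ms response -> {worst_case_monitor_delay_ms(hz, 10):.1f}ms")
# 60Hz -> 26.7ms, 120Hz -> 18.3ms, 144Hz -> 16.9ms
```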

    4. Input Device
    The final piece of the input puzzle is your peripherals! On a PC this will generally be your mouse and keyboard. I won't focus on the keyboard aspect since generally you'll be holding down a button which produces a constant stream of data. It may be important in Overwatch if you were to use a life saving ability, but the vast majority of the time it is negligible.

    So how does a mouse work? A modern optical mouse works by shooting light from a light source (usually an LED) at your desk, then using a sensor to detect how the reflected image of the surface moves. If your mouse is plugged into a USB port, by default it will be "polled" every 8ms (or 125Hz). Some mice, such as Logitech Gaming mice, will allow you to select a polling rate up to 1000Hz (1ms). If you already have a mouse like this, you can test it out by changing the polling rate and moving your mouse the same distance at the same speed across the screen.

    pollinginterval.jpg

    Having a polling rate which lines up with your monitor's refresh rate is good because you will have a consistent movement pattern that produces the same result over and over (i.e. the mouse is precise). Above is a representation of each frame being rendered against the timing of mouse polls, comparing 125Hz and 1000Hz polling. The example shows that with a 125Hz poll, your movement might be a few ms out of date by the time the frame loads. Not only that, but the time at which movement is detected is inconsistent, leading to inconsistent aim. The below image from BlurBusters' Mouse Guide shows this in action as stutters of the mouse cursor.

    mouse-125vs500vs1000-1024x570.jpg
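    Polling intervals work the same way as refresh intervals - 1000ms divided by the rate - and the interval is also the worst case for how stale your latest movement can be when a frame is rendered:

```python
# Interval between mouse polls; also the worst-case age of your latest
# mouse movement when a frame is rendered.
def polling_interval_ms(poll_hz):
    return 1000.0 / poll_hz

for hz in (125, 500, 1000):
    print(f"{hz}Hz polling -> movement up to {polling_interval_ms(hz):g}ms old")
# 125Hz -> 8ms, 500Hz -> 2ms, 1000Hz -> 1ms
```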

    So we now know what makes a mouse precise, but how do we make sure we are accurate? As shown in the below diagram, accuracy and precision aren't necessarily the same thing. We've covered polling rate to improve our precision, but our accuracy lies in the speed at which our mouse moves our cursor across the screen. As you can see from the setups Overwatch pros use, they generally have a lower DPI (Dots Per Inch - how many movement "counts" your mouse reports per inch, which determines how fast your cursor moves) in the 400-800 range. You'll also notice that they have a polling rate of either 500Hz (2ms) or 1000Hz (1ms). I've found the best way to hone in on the best DPI for you is by recording yourself playing the game. If you're overshooting the mark, your DPI is too high. If you're not quite making it to the target, try increasing your DPI. Give yourself a chance to adjust to the new DPI - it may take a day or two.

    [​IMG]

    Summary: Increase your polling rate to reduce your input lag to 1-2ms. Work out which DPI works for you so you are both accurate and precise. You can check your current Mouse Polling rate here.
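    One handy way to compare DPI/sensitivity combinations is the physical distance your mouse travels for a full turn. A sketch, assuming the commonly cited Overwatch yaw of 0.0066 degrees per mouse count (that constant is my assumption, not something from this post - check it for your game):

```python
# Physical mouse travel needed for a full 360-degree turn.
# YAW_DEG_PER_COUNT = 0.0066 is the commonly cited Overwatch value at
# in-game sensitivity 1 - treat it as an assumption, not gospel.
YAW_DEG_PER_COUNT = 0.0066

def cm_per_360(dpi, in_game_sens):
    counts = 360.0 / (YAW_DEG_PER_COUNT * in_game_sens)  # mouse counts per turn
    inches = counts / dpi                                # physical inches moved
    return inches * 2.54

print(f"{cm_per_360(800, 5):.1f}cm per 360")  # ~34.6cm at 800 DPI, sens 5
```

    A lower DPI or sensitivity gives a larger cm-per-360, which generally means finer control at the cost of bigger arm movements.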

    How to Improve Your Input Delay in Overwatch
    Overwatch provides an easy way to check the time it takes to generate one tick's worth of data. The best way to test this is to go into a Practice match and hit CTRL + SHIFT + N. A fast-moving graph will pop up measuring a heap of different variables. We are interested in the SIM values; you will see three: low, average and high.

    [​IMG]

    As you can see in the screenshot above, my SIM value spiked dramatically as I took the screenshot. You do not want this happening during play. You want an average that sits around 5-10, though if you can get below 5 that is fantastic. Spikes are not good as they interfere with the mouse precision we spoke about earlier; we want a consistent action to represent the same distance every time.

    We can reduce this SIM value in a number of ways (this is by no means an exhaustive list):
    • Set your graphics settings to LOW in the Overwatch settings
    • Enable in-game:
      • Full Screen Mode
      • Optional (can improve game looks but might add slight delay):
        • Textures to High
        • Models to High
        • Texture Filtering to High
        • Anti Aliasing to High
    • Disable in-game:
      • Dynamic Reflections
      • Local Reflections
      • Ambient Occlusion
      • VSync
      • Triple Buffering
      • Lock to Display
    • In NVidia Control Panel:
      • Go to Manage 3D Settings and set Maximum Pre-rendered Frames to 1
    • In Battle.net Settings:
      • Under General, change "When I Launch a Game" to "Exit Battle.net Completely"
    • Set Overwatch to have a High Priority value in Task Manager (can be done with a registry fix, see below)
    • Use the Windows Basic theme rather than Aero (resource hog)
    Paste the below into Notepad and save it as "overwatch.reg". Run it once and you should now be running Overwatch in high priority mode, always! You can change the exe name to do this for any other game such as Counter Strike as well.

    Code:
    Windows Registry Editor Version 5.00
    
    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Image File Execution Options\Overwatch.exe\PerfOptions]
    "CpuPriorityClass"=dword:00000003
    
    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Image File Execution Options\Battle.net.exe\PerfOptions]
    "CpuPriorityClass"=dword:00000005
    If you have any tips on reducing this input delay further or have any comments on my explanations, please post below! I hope you enjoyed this post, it has taken me a fair while to go through everything. :emoji_slight_smile:

     

    #1 Nakid, Nov 21, 2016
    Last edited: Nov 21, 2016
  2. FusSionzZ97

    FusSionzZ97 Overwatch Officer
    Division Leader Team Captain

    Joined:
    May 7, 2016
    Unreal Credits:
    1,139
    Good read Nakid.

    Screen tearing can be really horrible on low-end monitors in certain games. A higher refresh rate monitor (120/144Hz) almost eliminates it, which makes options like V-Sync, G-Sync and FreeSync less worthwhile (all sync options increase input lag as well). Either way, I recommend you always have these disabled unless you're playing a non-competitive/singleplayer game with noticeable screen tearing.

    Also for graphical settings keep "Models" on low as it won't display certain bushes, plants, flags and whatnot around the map.

    Does my sim value check out?
    http://image.prntscr.com/image/5d78bf82c67d47d7b23fc3cf2f1ef77d.png
     
  3. Agamemnus

    Agamemnus Administrator
    Unreal Officer

    Joined:
    Feb 17, 2016
    Unreal Credits:
    12,237
    What's the data tick rate of overwatch?

    If that's 50, then wouldn't having a monitor and FPS set to the same with sync ON be as delay-reduced as possible? ie. The data received is shown at the first frame your PC is able to show it, and the time between the data change and the display change will be constant for every frame?
     
  4. FusSionzZ97

    FusSionzZ97 Overwatch Officer
    Division Leader Team Captain

    Joined:
    May 7, 2016
    Unreal Credits:
    1,139
    It's 60/20: 60 updates per second to the server and 20 from it. With a framerate capped at 50 you'd only be sending 50 updates a second, as well as seeing fewer frames on screen. Even if you're playing on a 60Hz monitor and achieving over 60 FPS, enabling V-Sync still delays frames to reduce screen tearing.
     
  5. Agamemnus

    Agamemnus Administrator
    Unreal Officer

    Joined:
    Feb 17, 2016
    Unreal Credits:
    12,237
    I used 50 as an example because I didn't know what it was. I looked it up and found they claim it's 63 unless it's scaled down because your internet can't keep up. here and here

    This means that if you're rendering frames at exactly 63Hz then each frame is based off the last tick and your game state is accurate.

    If you're rendering faster than 63Hz then some of your frames are extrapolated. Take 64Hz for example:

    tick @  0 - - - - - - - - 15.87 - - - - - - - - 31.75 - - - - - - - - 47.62   (63Hz, ~15.87ms apart)
    frame @ 0 - - - - - - - - 15.63 - - - - - - - - 31.25 - - - - - - - - 46.88   (64Hz, ~15.63ms apart)

    So frame 1 lands exactly on the first tick. But frame 2 is still based on information from the first tick, and the new data arrives less than 1ms after frame 2 was rendered on your screen. The same happens for frames 3 and 4. So your picture of the game state for the duration of frame 2 is out of date compared with what your computer knows is ACTUALLY happening in the game. Your computer may be showing an extrapolated frame, so it LOOKS like smooth motion, but at 15.87ms into the game, every player who moved their mouse or pressed a button in the last 15.87ms is doing something different from what appears on your screen. At that moment, your computer is showing you something it believed was happening at the 15.63ms mark, even though it knows that is no longer true.
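    That drift can be sketched numerically (63Hz ticks vs 64Hz frames, matching the diagram above):

```python
# For each rendered frame, find the newest tick it could be based on
# and how stale that tick already is by the time the frame appears.
TICK_MS = 1000.0 / 63    # ~15.87ms between server ticks
FRAME_MS = 1000.0 / 64   # ~15.63ms between rendered frames

for n in range(4):
    frame_time = n * FRAME_MS
    last_tick = (frame_time // TICK_MS) * TICK_MS  # newest tick at or before the frame
    print(f"frame {n + 1} at {frame_time:6.2f}ms uses tick at {last_tick:6.2f}ms "
          f"({frame_time - last_tick:5.2f}ms stale)")
```

    Frames 2-4 each land just before the next tick, so they are all based on data nearly a full tick old.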

    Since "sync" only synchronises your graphics card to your monitor, you'd have to sync them both to 63Hz. If your monitor is at 50Hz and your graphics card is at 63Hz, then at some stage the top half of your monitor is showing the PREVIOUS frame while the bottom half is showing the CURRENT frame. And if your graphics card is rendering faster than the tick rate, your monitor will sometimes show an accurate half and an inaccurate half.

    Does anybody know if this is wrong? I'd love to hear how.
     
  6. FusSionzZ97

    FusSionzZ97 Overwatch Officer
    Division Leader Team Captain

    Joined:
    May 7, 2016
    Unreal Credits:
    1,139
    Wouldn't syncing your frame update rate with the game's calculation rate introduce input delay, similar to how V-Sync does?

    We're only receiving 20 updates a second, so syncing with the server would be a disaster. Even in games that run at lower tick rates (early Battlefield 4 ran at 10/30, now 30/30-60/60), it's still always beneficial to play at higher framerates, even beyond what your monitor can display, and especially if you're running a 120/144/165Hz monitor like I am. So in a game like Overwatch I'm having 144 frames displayed to me (which is super beneficial); even though the game's send/receive and calculation rates are nowhere near that, I'm still getting a better experience than any 60Hz monitor user.

    I don't know, I think we need someone smarter to answer your question :emoji_slight_smile:
     
  7. Agamemnus

    Agamemnus Administrator
    Unreal Officer

    Joined:
    Feb 17, 2016
    Unreal Credits:
    12,237
    Internet lag might play a role actually. It would mean the ticks are never perfectly timed.
     
  8. Nakid

    Nakid Intrepidus Dux
    Unreal Officer Streamer

    Joined:
    Feb 14, 2016
    Unreal Credits:
    5,612
    Was going to mention this. There is no way to synchronise it even if you had the same number of server ticks/frames in the same second. That is the point of getting everything to poll so quickly: even if they aren't synced up, the information is still the next best thing.
     
  9. Agamemnus

    Agamemnus Administrator
    Unreal Officer

    Joined:
    Feb 17, 2016
    Unreal Credits:
    12,237
    Yeah I've been thinking about it. Just 1ms difference in ticks could mean you're missing information for up to 2 frames. It wouldn't work unless you were playing LAN and the server had zero issues.
     
