
How To: The Ultimate Video Recording, Encoding and Streaming Guide

Discussion in 'Tutorials' started by Agamemnus, Feb 14, 2017.

    Over the next few posts I’ll take you through the main technical points of recording, encoding and streaming video, in particular game footage. Most people can set up scenes and webcams with just a little patience, trial and error. But so many people out there don’t understand some of the basic, yet crucial concepts that go on under the hood.

    If you’re reading this, you’ve undoubtedly heard of NVENC, Fraps, x264, DxTory, Shadowplay and a bunch of other technologies. In this guide, I’ll be focusing on what I think are the best, yet still pretty easy to use.

Since OBS can do pretty much anything in regards to streaming and recording, we’ll be using that, with a couple of side comments for FRAPS users and other encoding possibilities. OBS Studio is the current version; Classic is being discontinued. Download it here in either Classic or Studio flavour:

    https://obsproject.com/download

    Update! OBS Studio now has buffer recording. The guide has been updated. There is now no reason to use OBS classic any more.


What can Classic do that Studio can’t?
• Buffer recording. (This is no longer true: Studio now has buffer recording, as per the update above.)
What can Studio do that Classic can’t?
• Stream in one quality while recording to your HDD in a different quality.


    The guide will contain the following posts:

    1 – This introduction.

    2 – Basic concepts.

    3 – Your choices and my recommendations.
    A - 3 video codecs.
    B - MKV vs MP4 vs WebM.
    C - Handbrake and AviDemux.
    D - Constant Quality vs Constant Bitrate vs Variable Bitrate.
E - Considerations for the future.

    4 – Encoding comparisons to help you choose.

    5 – Encoding the Video portion with Handbrake.

    6 – Encoding the Audio portion with Handbrake.
    A - Separating the myths from the facts.
    B – Definitions, formats and my choices.
C – Examples of the Handbrake Audio Tab.

    7 – Streaming basics and a comparison – CPU vs NVENC vs Quick Sync.

    8 – Streaming examples for CPU, NVENC & Quick Sync.

    9 – Recording examples for CPU, NVENC & Quick Sync and recording while streaming.

    10 – Buffer recording and AVIDemux.


    If you have any questions or comments please reply below and I'll do my best to help out!!!
     
    2 – Basic concepts


    Quality = bandwidth * compression

    It’s a sordid state of affairs to see how many people on the internet don’t get this, so let’s get it out of the way straight up.

The video on your screen is a collection of data displayed as pixels a certain number of times per second. If your resolution is 1080p, your colour depth is 10-bit and your frame rate is 25, then:

(10-bit x 3 colour channels) @ 1920x1080 @ 25 fps = 185 MiB/s, or about 652 GiB/h.

    185MiB/s as the bitrate unit in OBS is 1,515,520 kb/s which is well beyond what our ISPs provide these days.
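If you want to see where those numbers come from, here is the arithmetic step by step:

1920 x 1080 pixels x 30 bits of colour = 62,208,000 bits per frame
62,208,000 bits x 25 frames per second = 1,555,200,000 bits per second
1,555,200,000 / 8 / 1024 / 1024 = about 185.4 MiB/s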

    In order to put a “moving screen”, “motion picture” or let’s just call it “video” into a file it needs to be reasonably packaged. This is where compression comes in.

    Take this sentence:
    “Jason Stefanac is a seriously good looking gentleman. Ask yourself this, did he or did he not attract numerous women to him over the weekend?”

    This is an example of “Lossless Compression”:
    “BOSL is a seriously good looking g¢. Ask yourself this, did he or did he not attract n╝ w¢ to him over the wø?”
    The standard/format for this compression will state that:
    • BOSL = Jason Stefanac
    • g¢ = gentleman
    • n╝ = numerous
    • w¢ = women
    • wø = weekend

    Any decoder which supports this format can turn the compressed phrase into the original without exception. If they can’t, then they do not support the format. This is what happens when you write an essay and save it as a Zip file. Any Zip decompressor can take it and show you the original.

    This is an example of “Lossy Compression”:
    “BoslBe1HawtManC’monDidYaCHowManySxsHeGotOnWkend?”

    The codec for this compression will state many rules on how to compress and decompress, but ultimately, if you pretend the codec is a teenager on Facebook, you can get the message from the compressed version but never be sure that you have the original.

    Now note this, you can GUESS how to decipher the compressed message, but you CAN NEVER get the original back. It is now gone forever. If you were to decompress the short version to a long version, then compress it again, the new short version will be slightly different again, and each time you compress and decompress the message, you will lose something, until eventually it becomes static or “white noise”.

    Basically, lossy compression will remove SOME quality from the original, while lossless does not. Each time you compress/convert/recode with lossy compression you are losing more and more quality. This important fact will come into play later.

So, say you have persons A and B. A has a bitrate of 120 kb/s and can compress a video to half its original size, while person B has a bitrate of only 80 kb/s but can compress a video to one third of its original size.
    Person A’s quality = 120kb/s divided by ½ = 240kb/s effective decompressed video.
    Person B’s quality = 80kb/s divided by 1/3rd = 240kb/s effective decompressed video.

    Even though person B has slower bandwidth, they are still fitting the same quality into it by using a stronger method of compression.

    The moral of the story is, better compression fits more quality into the bandwidth.

    Here’s a look at some quality benchmarking some guy did. You can see screenshots of the results from a link in his post:

    https://obsproject.com/forum/threads/obs-benchmarking-1080p-60fps-cpu-vs-nvenc-vs-quick-sync.15963/

He’s testing what has been known for a while now: using your CPU to power x264 encoding gives you the best quality for bandwidth, while offloading to NVENC or Quick Sync will free up some CPU if you need it. However, because they don’t compress as effectively as the CPU, there is a drop in quality.

The trick is finding the sweet spot. Encoders today are capable of H264 compression at such a powerful setting that the resultant 1080p video is encoded at a rate of less than 1 FPS. That means a 30FPS video that goes for one minute would take over half an hour to encode. This is fine if you’re working with small clips and saving them to your HDD or uploading to YouTube. However, for streaming this will not help you. You need to be able to encode as fast as you want the viewer to watch live.

Looking at the table in the link above, you can see some of the quality settings caused “stuttering”, and you’ll notice that the CPU usage for those was pretty maxed out. It means that the program had to drop some frames from the video so that the CPU had time to keep up with the encoding. The sweet spot is where your hardware is using as much power as it can to encode the video, without actually causing lag in the game or needing to drop frames. The more encoding power you can use, the more quality goes into whatever bandwidth you’re putting out.

Regarding saving your recordings and possibly uploading them to YouTube later: in principle, we don’t have bandwidth restrictions here. You may however have Hard Disk restrictions, but if you have a dedicated HDD for recording that won’t be a problem. Lossless HotS gameplay in 1080p @ 30fps uses about 10-20 MiB/s, and if you up that to 60fps you’ll see double that, which isn’t too bad.

So considering this, we should record lossless. This means that you are actually capturing the original footage, so that it will be played back identical to how it happened on the screen. There’s more to consider: if you want to edit the videos, don’t encode them first; use the lossless versions. Also, encode to a YouTube supported format so that YT won’t transcode yet again. The worst case scenario is this:
    1 – Record original footage in lossy H264 (NVENC, QS or x264 preset) <- 1 encode.
    2 – Transcode to save space <- Now 2 encodes.
    3 – Edit and splice together clips, then export video <- 3 encodes.
    4 – Upload to YouTube and they encode to their format <- 4 encodes in total.

    Remember that each time this happens, you lose a bit of quality. A great recording can look kinda crappy by the time all this happens. What I do is:
    1 – Record original footage lossless. <- 0 encodes.
    2 – Merge videos together with AVIDemux <- 0 encodes.
    3 – Encode result with Handbrake top settings in YT compatible format <- 1 encode.
    4 – Upload to YouTube, doesn’t get processed because it’s already compatible <- Still only 1 encode.

So that’s how planning ahead affects quality. Another thing to consider when providing video content is the data rate. A large data rate means that the files will take up more space on your disk, and also that the viewer will have to download more to watch your video. In most cases, people won’t have issues with this. However, some viewers have terrible internet, while others have periods where their housemates or family have a Netflix party or something, possibly limiting the data rate to their own computer. If your video could be 100MB or 50MB at the same quality, then the 50MB option is undoubtedly better, because the viewer needs less buffering to play it. Also, YouTube might downscale the resolution from 1080p to 720p (less than half the pixels) to accommodate the viewer’s bottleneck, meaning the viewer doesn’t get to see the best possible picture just because there was too much data for their internet at the time!
     
    3 – Your choices and my recommendations.


    A - 3 video codecs.

    Decent reference here: http://www.streamingmedia.com/Artic...Debate-Googles-VP9-Vs.-HEVC-H.265-103577.aspx
    And here: https://blogs.gnome.org/rbultje/2015/09/28/vp9-encodingdecoding-performance-vs-hevch-264/
    Netflix announcement here: http://techblog.netflix.com/2016/08/a-large-scale-comparison-of-x264-x265.html


    Option 1 – H264:

H264 is widely used and gained most of its following because it is the successor to H263 and is related to XviD and DivX. At the time of writing, H264 is the current industry standard and is used on Blu-Ray discs. Of the 3 options here, it offers the lowest amount of compression (with some room for wiggle), meaning that for a given quality, you use the most data. However, it is compatible with YouTube standards and they will accept the video without necessarily recoding it into VP9, meaning your final product online hasn’t lost a bit more quality by the time the user sees it. As for what profile to use, High 5.1 is pretty solid. Most modern hardware decoders will do High profile 5.1; some don’t, but can do it through software, and when you watch on YouTube the app itself or Chrome/Firefox/Edge will do it through software, so as long as your CPU can keep up it’s fine. High 5.2 is the latest and would get a tiny bit more compression and allow higher FPS (4K @ 50fps compared to 5.1’s 4K @ 25fps), but again that only matters for hardware decoders, not software, and oddly enough a surprisingly TINY amount of hardware devices decode 5.2, while 5.1 is pretty stock standard these days.

    • Pros: Fastest to encode, lots of free software for tinkering, YouTube doesn’t always recode, most compatible on devices.
    • Cons: Uses the most data.
    • Recommendation: Currently most people should use this for 1080p game footage. The other codecs get better quality when bitrates are tight, or save much more data when the footage is larger (4K 60FPS) but at 1080p 30fps this competes very well.


    Option 2 – H265:

H265 is less widely used because it’s new. It is capable of much better compression, so the data rates can be great, but because of the processing power required to do this, encoding times are much longer, and some smaller or older devices aren’t fully capable of properly playing back this video because even the decoding power required is pretty decent. Finally, because YouTube doesn’t play H265 videos, if you upload one it will be converted into H264 and VP9 by YouTube, meaning you lose a little quality there for the final viewing. Some hardware encoders will “encode” H265, but they are pathetic at it and it’s not worth looking into. Regarding profile, at the time of writing it’s basically Main or Main10. Main is just fine; you can use Main10 if you like, but it doesn’t make a difference in my opinion. There isn’t much above that since the format is still pretty new, and since we’re lucky enough to have hardware designed for playback already, it’s a no-brainer.

    • Pros: Small data rate, handled by Handbrake well.
    • Cons: Plays back poorly on old or weak devices, loses quality when uploaded to YouTube due to second conversion.
    • Recommendation: Use for backing up your DVDs. Over 25% disk space saving on Ghost Whisperer Season 5 Disk 6 at RF20 compared to H264. 1080p gets you similar data rates for transparent quality.


    Option 3 – VP9:

Google developed VP9 as the successor to VP8 and a royalty-free alternative to H265. It’s free, so no licensing fees for hardware or software companies. Slowly but surely, YouTube is converting all of its videos into VP9, leaving only the most compatible H264 ones until last. On the plus side, compatible VP9 videos are not currently getting converted when uploaded to YouTube. Their data rates are just as good as H265, but for some reason I don’t fully understand, the quality and stream rate is a little more reliable. Also, decompression takes less processing power, so it works on more devices. Not to mention it’s already built into Google Chrome and Mozilla Firefox, as well as any YouTube app. Profile wise, not many devices support hardware decoding of VP9, but it’s pretty irrelevant because the decoding is so efficient even weak CPUs can do it through software. The main drawback is the difficulty in creating these videos. Handbrake can do it, but the libraries it uses only run on a single thread. This means H265 will use all 12 threads on my CPU, taking it to 100%, but VP9 only clocks in at 10% total CPU usage and takes about 5-8 times longer. This may change with future releases, but for now we’ll focus on FFMPEG use through the command line. All the facilities this codec has to offer are not yet unlocked, so the data rate isn’t as good as it COULD be yet, but it’s on the way. Ultimately, the difficulty in getting smaller files for the same quality is too great for me to recommend this. If you’re doing 4K@60fps or doing lower res at TEENY TINY bitrates, then by all means use it and you may see PLENTY of benefits. However, for the purposes of this tutorial, I recommend you don’t bother (see section E for why).

    • Pros: Good quality and small data rate, widely compatible on devices, least complex to decode, currently doesn’t get converted on YouTube.
    • Cons: Underdeveloped therefore takes much longer to encode, difficult to use and can’t make use of full compression facilities yet so often files aren’t even smaller than H264 when made on home PCs.
    • Recommendation: If you felt the need to read a guide about this then it’s probably beyond you. VP9 was made by Google to be single threaded. YouTube encodes thousands of videos simultaneously. So while we want to use say 8 threads on encoding a single video, they would use 8 threads on 8 videos. This improves speed for them because they don’t have to divide-conquer-reconstruct the videos. End result is that it’s not designed for the home user and it shows. If you’re a fanatic about quality then by all means use this as I’ve included an FFMPEG part. Otherwise, leave it alone.


    B - MKV vs MP4 vs WebM.

    The reason why I say use MKV is because frankly, it’s the future. MP4 containers can contain a certain amount of stuff. MKVs can contain more. They can have a menu (apparently), multiple camera angles that can be displayed simultaneously or switched between with the viewer’s remote control, they can have audio encoded in Opus… the list goes on. The number of features available in MKV containers is so much greater than any other container, they will ultimately outlast anything else that currently exists. That’s why we should start using them now. If you have a gameplay video you want to keep, is there really an advantage in using MKV over MP4? No. But why make a habit now that you’ll one day change?

As for WebM containers, they are basically just a cut-down version of MKV. Take a free video codec stream like VP8 or VP9, with a free audio codec stream like Opus or Vorbis, and put them into an MKV container. This is the effective definition of WebM; if you rename the file extension to WebM then it’s totally valid. It’s just an MKV file where all the contained streams are free. Ideally, all files would be WebM, but people use royalty codecs, so just use MKV for consistency, because it accommodates both.

    So I say use MKV, because why use any other container?
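As a quick aside: if you already have an MP4 and want it in an MKV container, you don’t need to re-encode anything, because a container swap is just a lossless stream copy. Here’s a minimal FFMPEG sketch (the filenames are only examples):

Code:
ffmpeg -i input.mp4 -c copy output.mkv

The “-c copy” switch copies every stream bit for bit into the new container, so it takes seconds and loses zero quality.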


    C – Handbrake and AviDemux.

    I love Handbrake. It works on Windows, Mac and Linux. It offers very up to date options including one of the most recent, audio as Opus in MKV containers. Since this tutorial focuses on getting great quality, it makes sense to use Handbrake because each user can tweak the settings to really push their individual computer to get the best videos they can.

Regarding Premiere: down below in section 4 of this guide, I’ve taken a video and encoded it a bunch of different ways to compare encoding settings. You’ll notice there’s only one setting for Premiere listed. It was the BEST compression that my version of Premiere could do. Every setting was turned up to the top and it just couldn’t compare to the Handbrake encodes. Sadly, Handbrake is not a video editor. To do anything remotely fancy you really need to consider using something like Premiere. But if you care about your quality, export your work as lossless, then encode that lossless video into your final file format using Handbrake. What I do is use AviDemux to join the “uA Intro” movie onto the start of my game footage, and to cut out any unwanted parts of the video (like pauses or between matches). AviDemux can clip, append and edit video without removing quality, because instead of re-encoding it literally copies the streams bit by bit into new files. Premiere can’t do this: by its nature of allowing complicated editing, it requires recoding the entire video, which loses quality every time it happens.
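If you’d rather do that kind of lossless joining from the command line, FFMPEG’s concat demuxer can append clips the same way AviDemux does, provided every clip was encoded with identical settings (which OBS recordings from the same profile will be). A sketch, with example filenames:

Code:
REM list.txt contains one line per clip, in playback order:
REM file 'intro.mkv'
REM file 'match1.mkv'
ffmpeg -f concat -safe 0 -i list.txt -c copy joined.mkv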

The main drawback to Handbrake, in my opinion, is if you want to go down the VP9 route. The libraries for VP9 are still in development, and on their official website they state that you should expect slow encoding times for the near future. The Handbrake devs aren’t fans of this, so they have only put a small amount of effort into the VP9 side of things. You can still do it of course, but it’s extraordinarily slow and the options aren’t very customisable. So for this guide, at the time of writing, I’ll be showing you VP9 through the command prompt with FFMPEG.


    D - Constant Quality vs Constant Bitrate vs Variable Bitrate.

This guy provides an excellent explanation:
    https://mattgadient.com/2013/06/12/a-best-settings-guide-for-handbrake-0-9-9/

    TLDR? Well, here’s the scoop…

    When you use a Constant Bitrate or “CBR” you apply a certain number of bits per frame. Frames can be grouped to provide better compression, but each group will have the same amount of data. This means that when the video doesn’t need many bits for a while it will still use them and have fantastic quality, but when it needs extra bits (like in fast moving scenes) it won’t get them and you’ll see a quality drop.

When you use a Variable Bitrate or “VBR” you’re specifying an average like in CBR, but there’s room to move. Some VBR codecs will let you specify a target average of, say, 8000Kbit/sec PLUS OR MINUS 1000Kbit/sec. So in these cases, the codec will sometimes use less data when it doesn’t need it, but allow some more bits when the video really does. This causes less fluctuation in the quality but still maintains the target file size.

Both of the above are examples of Average Bitrate or “ABR”, which is sometimes called “Target Bitrate”.

    ABR gets huge benefits from using 2-pass encoding. The first pass analyses the video to find all the places where it needs the most data and where it needs very little. Then on the second pass, it remembers these details and applies a Variable Bitrate more effectively because it “knows the future”.

    Constant Quality or “CQ” is different. You specify the QUALITY you’re after then it will use the bits it needs to in order to achieve that. The profile you set (or individual parameters) will determine how hard your encoder can try to get the quality it’s looking for. For example, it can search for same-colour pixels within a 16 pixel radius, or a 24 pixel radius. If the encoder achieves the quality it is set to, it can keep looking around to find ways to do it with less data until it reaches the parameter/preset limit. If it can’t compress the video well for a while, it will just use more bits to maintain the quality. It doesn’t need 2 passes to do this, 1-pass will always work.
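To make the difference concrete, here’s a sketch of both approaches using x264 through FFMPEG (the bitrates and filenames are only examples). The first two commands are a 2-pass ABR encode targeting 8000Kbit/sec; the third is a 1-pass Constant Quality encode:

Code:
REM Pass 1 only gathers statistics, so its output is thrown away (NUL on Windows).
ffmpeg -i input.mkv -c:v libx264 -preset veryslow -b:v 8000k -pass 1 -an -f null NUL
ffmpeg -i input.mkv -c:v libx264 -preset veryslow -b:v 8000k -pass 2 -c:a copy abr.mkv
REM 1-pass Constant Quality: pick the quality and let the bitrate fall where it may.
ffmpeg -i input.mkv -c:v libx264 -preset veryslow -crf 20 -c:a copy crf.mkv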

    The link above contains the following example for a TV series:
    • Constant Quality RF22: Episode 1 = 278MiB, 2 = 349MiB, 3 = 363MiB, 4 = 304MiB
    • Average Bitrate 798kbps: Episode 1 = 323.5MiB, 2 = 323.5MiB, 3 = 323.5MiB, 4 = 323.5MiB
    Both methods can achieve a total size of 1294MiB but in the ABR method, Episode 1 got more bits than it needed, 2 & 3 didn’t get enough and 4 was probably pretty close to right.

    So with ABR, Episode 1 looks great! 2 & 3 however are below the standard we want.

    With Constant Quality, we always get the quality we planned for!!!

    There’s more information on CQ in another post by the same guy:
    https://mattgadient.com/2014/01/06/handbrake-rf-slower-speeds-craziness/

TLDR? What he explains here is that when you use a fast preset on your encoder, it winds up with a large file size. Slower presets will get you smaller file sizes. The preset tells your encoder how hard to try to get the quality, so even if it reaches the right quality but still has time to search, it will try to find ways to decrease the data required even further.

But there’s an anomaly: sometimes using slower presets (or larger parameters) you will get a LARGER file size! Why? It’s because sometimes with CQ the encoder searches within the parameters you allowed it and can’t find enough quality, so it just saves the frame as-is. But if your parameters are searching far and wide for more options, it can locate tiny pieces of extra quality to fit in, and once it finds them, it will use however many bits it needs to fill the target.

    In the end, he says that the “Placebo” preset in Handbrake x264 is actually about 0.25 better quality on the scale than its nearest setting “Very Slow”. In my examples in the next chapter, I can see this happening occasionally. Also I use a setting called “Max” which isn’t an x264 preset, but rather a customised list of parameters filled out to their highest values, with maybe one or two exceptions for sanity. Almost every video I have ever tested with “Max” settings looks unbelievably good on RF20 and sometimes the file size is larger than Placebo to allow this, but sometimes it’s smaller because it found a better way to compress.

You can see this guy’s results at a few RF levels in the video below:
[video embed]


There’s an additional detail. Quantization Parameter or “QP” is where every frame is the same quality, while Constant Rate Factor or “CRF” will apply lower quality to fast motion and higher quality to low motion. The strategy is to make it more appropriate to a human eye, which sees fast motion as a blur (so CRF will allow blurring) but focuses on still images. Most people believe that x264 CRF values are generally 2 levels of quality better than QP, but that’s largely because of the calculation difference. The end result is that a CRF video can look the same to humans as a QP video but with less data. To a computer they don’t look the same, but to a human they do. This isn’t like the audio purist argument of “lossy sound is still lossy”. CRF makes videos look MORE like real life to humans than QP does. Compared to QP, CRF will perform better on focused still motion where humans will notice an improvement, yet allow blur where humans will expect it, so in the moments where you are watching and CAN see a difference, CRF is better. There’s a good explanation here: http://slhck.info/articles/crf
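If you want to test the QP vs CRF difference on your own footage, x264 through FFMPEG exposes both. Another sketch with example filenames; compare the resulting file sizes afterwards:

Code:
REM Constant quantizer: every frame gets the same quality.
ffmpeg -i input.mkv -c:v libx264 -preset veryslow -qp 20 -c:a copy qp20.mkv
REM Constant Rate Factor: quality shifted away from fast motion, towards still detail.
ffmpeg -i input.mkv -c:v libx264 -preset veryslow -crf 20 -c:a copy crf20.mkv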

My recommendation: Constant Quality. Specifically CRF. We want the video to look up to standard; bitrate is a secondary priority. Of course, because it’s only 1-pass, we also save encoding time. The rest of this guide will usually reference this (with the exception of the Premiere encode, which didn’t have this facility, and streaming, where it’s not appropriate).


    E – Considerations for the future.

    Sadly, nothing lasts forever. The Alliance for Open Media consists of Google, Amazon, Cisco, Intel, Microsoft, Mozilla and Netflix. This AOM wants to finalise a bitstream format called AV1 by the end of March 2017. Within a few months after that, we will see new codecs released and YouTube will proceed to convert their videos to this new format. Future wise, it will allow higher resolutions at greater colour depth (12-bit last time I checked, for 4-16 times as many colours) and require considerably less data than VP9/H265 at 4K@60fps (they’re hoping for 50% less). In terms of today’s videos, the savings might not be as much, but they will be there to some degree. It will also be compatible with HTML 5, WebM and Opus so those things will not change.

    Here’s my prediction… H264 video will be the first to go. YouTube will convert all their H264 video straight to AV1 and they will start doing so within 3 months of the bitstream being finalised. Then, they will start with the top resolution popular VP9 videos. Anything over 1080p will go first, then they will start climbing down the quality ladder. By the time they get to 720p most of your videos will be out of date and we will all be encoding greater than 1080p in AV1 anyway so no biggie. But bear in mind, if you’re uploading at H264 now, your videos will be the first to get recoded and lose some quality.

    Stay tuned here for updates regarding AV1 :emoji_slight_smile:

    My final recommendation is to encode H264 RF 20 at Very Slow for uploads to YouTube and if you wish to keep a copy on your own computer, make a separate H265 for your own collection. It’s a lot of encoding, but it will save you Hard Disk space. If you’re not worried, then just use the H264 for both.
     
    4 – Encoding comparisons to help you choose.


    These 2 reviews are pretty good:
    http://www.videoquality.pl/preset-settings-x264-quality-compression-speed-test/
    https://mattgadient.com/2013/06/12/a-best-settings-guide-for-handbrake-0-9-9/

I took a video and encoded it with various settings to show the difference in encode time and final file size. Most of the videos are such good quality that I had to deliberately hunt for differences from the original, except for the Premiere version, which was much more obvious.

    For testing purposes I put together some files recorded with OBS Lossless 1920x1080 Progressive 25-30FPS YUV 8-bit colour Profile High 4:4:4. They vary in time, specified at the top of each part. The audio component is specified too so that the final file size can be compared more accurately. Audio encoding could be passthrough, which means that I’m only encoding the video, the audio will remain identical to the OBS recording bit for bit. If audio encoding is specified, then the encoding FPS will show as ever so slightly slower than the true value. Basically though, you can take these numbers at face value.

    Encoding method = average FPS encoded, total time taken, final encoded file size.


    Heroes of the Storm original 2.17GiB, 201 seconds, 30fps, audio passthrough component is 5MiB:



    HandBrake x264:

    RF 20 Slower Profile H5.1 = 24 FPS, 251 seconds, 205MiB. Tiny amount of distortion.
    RF 21 Very Slow Profile H5.1 = 17.5 FPS, 371 seconds, 168MiB. Ever so slightly more blur than RF20 Very Slow.
    RF 20 Very Slow Profile H5.1 = 17.5 FPS, 375 seconds, 193MiB. Nearly impossible to see faults.
    RF 20 Placebo Profile H5.1 = 5.7 FPS, 1057 seconds, 194MiB. Nearly impossible to see faults.
    RF 20 Max (see below) Profile Auto = 1.6 FPS, 3768 seconds, 197MiB. No visible faults.


    Handbrake x265:

RF 23 Very Slow PM = 1.0 FPS, 6100 seconds, 105MiB. Can see some blurring around heroes and effects. Minions also blocky.
    RF 22 Very Slow PM = 0.9 FPS, 6800 seconds, 118MiB. Same blur as above but less noticeable.
    RF 21 Very Slow PM = 0.9 FPS, 6800 seconds, 133MiB. To me appears identical to RF22.
    RF 20 Very Slow PM = 0.9 FPS, 6800 seconds, 150MiB. Slightly better than RF22 but still blurring.
    RF 19 Very Slow PM = 0.9 FPS, 6800 seconds, 168MiB. Appears about the same as 264 RF20 Very Slow.
    RF 17 Placebo PM10 = 0.5FPS, 12200 seconds, 211MiB. Looks the same as 264 RF20 Max, no visible faults.


    FFMPEG VP9:

    CQ 20 Speed 0 = 0.6 FPS, 10100 seconds, 198MiB. Some blocking on grey ground but pretty good.
    CQ 22 Speed 0 = 1 FPS, 6050 seconds, 183MiB. Same blocks as above but bit more noticeable. Similar issue with the ground as Premiere version but not as frequent.
CQ 22 Speed 0 “aq 1” = 0.9 FPS, 6720 seconds, 210MiB. Slight blocks at curved fog of war, otherwise fine.
    CQ 26 Speed 0 = 4 FPS, 1510 seconds, 147MiB. Included this one because I noticed something interesting. Compared to x265 RF22 at 118MiB, this one has worse quality on the ground where the colours are dark and blending, but better quality on the heroes and minions where the colours are sharp and contrasting. Not the fantastic quality we’re after here, but worth putting out there.


    Premiere H264:

    Maximum (see below) = 33 FPS, 181 seconds, 199MiB. Most distortion of all videos.

Premiere encoding has fewer options. I went for H.264, 29.97 FPS, Progressive, VBR 2-pass, Target 8Mbps, Max 11.01 Mbps, AAC 320Kbps, 48KHz Stereo. I also ticked Render at Maximum Depth AND Use Maximum Render Quality.


Shootmania original 1.82GiB, 31 seconds, 30fps, audio PCM encoded to Vorbis 0.5MiB:


    HandBrake x264:

    RF 20 Very Slow Profile H5.1 = 4.4 FPS, 212 seconds, 79.5 MiB. Some blockiness during fast motion and also in the grey clouds. Slight blur on the brick wall during motion.
    RF 20 Max (see below) Profile Auto = 0.8 FPS, 1162 seconds, 64.6 MiB. Tiny amount of blockiness during fast motion and a little in the grey clouds.


    Handbrake x265:

RF 20 Medium PM10 = 10 FPS, ~93 seconds, 84.6 MiB. Tiny amount of blockiness in the clouds.
    RF 20 Very Slow PM10 = 4.4 FPS, 215 seconds, 79.6 MiB. Nearly impossible to see faults.
    RF 20 Placebo PM10 = 1.3 FPS, 720 seconds, 80.2 MiB. Quality is sensational, no visible faults.


    FFMPEG VP9:

    CQ 20 Speed 0 = 0.5 FPS, 1866 seconds, 84.2MiB. Quality is sensational, no visible faults.


    Overwatch original 5.98GiB, 1449 seconds, 25fps, audio passthrough component is 36MiB


    Handbrake x264:

    RF 20 Very Slow Profile H5.1 = 4.2 FPS, 8630 seconds, 1.71GiB. Can see blockiness on surfaces where colour blends together. However most of the video is pretty amazing quality.
    RF 20 Placebo Profile H5.1 = 1.3 FPS, 27888 seconds, 1.71GiB. Looks the same as Very Slow.
RF 20 Max (see below) Profile Auto = 1.1 FPS, 33232 seconds, 1.71GiB. Same blockiness, but ever so slightly less obvious. Possibly in my imagination…


    Handbrake x265:

    RF 20 Medium PM10 = 8.1 FPS, 4520 seconds, 1.35GiB. Appears identical to 264 Very Slow above.
    RF 20 Slower PM10 = 1.0 FPS, 37334 seconds, 1.29GiB. Also identical.


    FFMPEG VP9:

    CQ Speed 1 = 1.8 FPS, 20912 seconds, 1.49GiB. Appears the same as the x265 versions.
    CQ Speed 0 = 0.9 FPS, 41252 seconds, 1.42GiB. Appears the same as the x265 versions.


    NOTES:
• All sources are in 1080p lossless and either 30 or 25 FPS. This is to represent what you, dear reader, are most likely to want to record in the real world. The results and differences vary widely with different types of objectives. For example, ABR at 4Mbit/sec produces significant quality drops in x264. That’s an issue Netflix and YouTube need to consider, however not us. Also, results start to vary again when considering videos in 4K @ 60 FPS, which we might update this guide to deal with in another year or so.
    • Heroes of the Storm is the only source where changing the speed also significantly changed the quality for the same RF. Considering HotS is a much more static game than Shootmania or Overwatch, this fits in with the general idea of RF, where mistakes are harder for a human to see during high motion but easy to see on static images.
    • Heroes of the Storm is also the only source where x265 and VP9 didn’t offer any real file size benefit without sacrificing quality. This fits with the idea that still images are easier to compress, while the fast moving images require more work and an advanced encoder will do a better job.
    • Heroes of the Storm is once again, the only source where the vast majority of encodes are able to be done in less than 1MiB/sec or 8Mbit/sec. Again, referencing the stationary screen play style of the video.
    • For Shootmania, the x265 RF 20 Very Slow encodes at the same FPS as the x264 version but somehow ends up with nearly exactly the same file size. In my opinion the video is clearer but that’s very subjective. There’s no point keeping an x264 version, get into the habit of keeping x265 versions now. Uploading to YouTube however, use x264.
    • Shootmania is the only source where VP9 stood out as better. The final file size isn’t the best but it’s very close. I can’t put my finger on it, but when watching that one after watching an x264 or x265 version it just somehow looks cooler. I’ve tried to screenshot frames to see what it is, but frame-to-frame it’s not visible. You have to do it yourself to see if you see the same thing I do. Sadly, I have nowhere to put these videos for direct download :(
    • Overwatch had an interesting result for x264. The different presets produced files of identical size DOWN TO THE BYTE. I should have done a checksum on them but deleted 2 of them before I thought of it. I’m willing to bet they are identical, although something about the “Max” version made me feel like it was clearer but I can’t prove it and it would make more sense that they’re identical.
• Overwatch is where x265 really shined, with the Medium preset encoding faster than all of the x264 versions and STILL producing a smaller file size. If you have the patience, use Slower. I thought about doing a test with Placebo, but my source file is over 24 minutes long and I have been encoding for days now; I don’t want to spend another 2 days just on that…

Regarding the HandBrake x264 RF 20 quality HotS video, I could only see the tiniest amount of distortion around Diablo doing a merc camp for one moment. The rest of the video looks like it does when playing the game: fantastic. The distortion is noticeable in the Slower encode, then hardly there in Very Slow or Placebo. With Max there is no distortion whatsoever; the quality appears perfect to my eyes and believe me, I looked over it many, many times. Premiere quality however had something that really bugged me. Often, when the screen moves, there’s a slight pixelation applied to the ground, which comes good after about ¼ to ½ of a second. In my opinion the quality is less than any of the HandBrake encodes, which makes sense, because the Premiere encode finished much faster.

    If you want to see the quality difference I’m talking about, pull up these 2 videos:
Handbrake Very Slow - [video embed]

Premiere - [video embed]


    Make sure you set both videos to 1080p and then move to the 1:15 mark and let it buffer. At the 1:19 mark, right after I scream “YES” and just before Pointy laughs, the screen gets shifted right. Pay attention to the lines/cracks on the ground. They go blurry with the screen movement, then a fraction of a second later, they clear up and the colour of the ground enriches. This doesn’t happen on the HandBrake encode, which seems much more consistently crisp. It can be hard to tell the difference, but some people notice it more easily than others.

    Regarding the quality for VP9 in the HotS video, the blocks around shadows I refer to are pretty minor. At a higher CQ I had blur around finer lines like cracks in the ground so I discounted them. At CQ 22 and 20 that blur is gone. But as characters move around the fog of war moves as a circular shadow. At least one time I could see the edge of the fog of war block up as it slowly crept over grey ground. Didn’t happen on brighter maps, but in this video I could see it on the demon side of the map. It’s pretty minor but worth mentioning.

What difference does this all really make? Well, if after a year you have 100 hours of amazing moments, they could occupy 3.6TiB of HDD or they could occupy 3.2TiB. Small saving? Yeah, kinda. But what are you really sacrificing here? A 10 minute clip might take 5 hours to encode, easily done overnight. But there’s one more thing to consider: the viewers who watch this. They need to DOWNLOAD the video, right? If their connection is not quite fast enough, they might have to buffer. If the file size is smaller, then they might not have to. Could it make a difference? Who knows for sure… But if you can deal with encoding while you’re asleep or at work, I recommend doing the slowest method you can bear. That’s what I do.
     
    5 – Encoding the Video portion with Handbrake.


    Get your copy of Handbrake here:
    https://handbrake.fr/downloads.php

    My version is 1.0.1 at the time of writing. If the appearance changes significantly I’ll update screenshots but you should be able to figure it out.

    So, open it up and pick your source file or disk. If you choose a disk or folder the top left “Title” dropdown will have multiple choices. You can “Add to Queue” for ALL titles if you have a default set for the way you want them all. Or you can individually select the titles and specify their settings then add to queue individually.


    Picture Tab:

[screenshot of the Picture tab]

OK, you want your source to show the resolution your original video is in. I recommend choosing Anamorphic = None and Modulus = 2, then going over to cropping, making it “Custom” and setting all the numbers to 0. Sometimes, if you’re encoding a source with a different aspect ratio, this tab will be the cause of squishing or stretching, so here is where you address that.

Anamorphic and automatic cropping are how Handbrake gets rid of the black bars around your video, setting the cropping values for you as multiples of the “Modulus”. When a video has black bars, it takes up more space on your HDD than it really needs to. Take the black bars away and the file is smaller; when it plays on a screen of a different aspect ratio, your screen will add the black bars back itself. Handbrake is pretty good at this but SOMETIMES it gets it wrong. Game footage should have no black bars anyway, and I like to force “No Cropping” with these settings just to avoid the possibility of problems.


    Filters Tab:

[screenshot of the Filters tab]

Turn them all off, unless you know your video is interlaced; then you can set it to “Decomb” using the “Default” preset. Interlace Detection will come up if you do this, and you can make it automatically detect interlaced videos, but I don’t.


    Video Tab:

[screenshot of the Video tab]

This is where the magic happens. Set your video codec, hopefully picking one based on some of the tests I provided above. Generally, you might need to experiment a bit to find what encode speed you’re comfortable with; I go for the longest I can bear. So, set the following (there’s a command-line equivalent sketched after this list):

    Video Codec = x264 or x265. Don’t use the others, VP9 is so much faster in FFMPEG and even there it’s slow.

    Framerate = Same as source, Variable. Because you don’t want to lose frames or make up new ones.

    Use Advanced Tab Instead = Off to use the presets on the slider, On to make the Advanced Tab come up for customisable options (see below).

    Encoder Preset = How fast you want it to happen. The slower it goes, the smaller the file size will be (exceptions happen, see above section 3-D for explanation)

    Encoder Tune = None. This will tune the final result to perform better on particular tests. Don’t do it, we just want it to look good to us, not to a computer.

Fast Decode = Off. Turning this on lets the video play on devices with weak CPUs, but it limits the compression technologies. If your device can’t play it, get a new device or turn down the resolution on YouTube.

    Encoder Profile & Level = High 5.1 for x264. Main for x265.

    Extra Options = Empty, completely blank. If you’re reading a guide on how to do this, then this box is not for you just yet.

    Quality = RF 20. You can fiddle with this if you think I’m too picky with my video quality. It’s not a big deal, you can always make a “small copy” and a “better copy” if you’re thinking upload times and such. Just whatever you do, use Constant Quality, not Avg Bitrate.
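For reference, here’s roughly the same recipe through HandBrakeCLI, the command-line build available from the same download page. Treat this as a sketch: the filenames are examples and the flags are from the 1.0.x CLI, so check HandBrakeCLI --help on your version:

Code:
HandBrakeCLI -i input.mkv -o output.mkv -f av_mkv -e x264 -q 20 --encoder-preset veryslow --encoder-profile high --encoder-level 5.1 --vfr -E copy

Here “-q 20” is the RF value, “--vfr” keeps the source’s variable framerate and “-E copy” passes the audio through untouched.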

    Audio Tab is covered in the next section.

    Subtitles Tab is beyond the scope of this guide. Needless to say, if your original contains subtitle tracks, you can add them to your final file. MKV containers can hold any number of subtitle tracks within them. You can even take an external subtitle file like .srt and embed it as a track in the MKV file.

    Chapters Tab again is beyond ye olde scope. But, if you like, you can insert chapter markers so that during playback, the chapter forward/back button will skip to that point. This might be good for long videos with several matches/segments.


    Advanced Tab:

[screenshot of the Advanced tab]

    The settings in the screenshot are what I use on my “x264 Max” settings in the encoding comparison examples above. Feel free to adjust it as you see fit. You can hover over each field for a short description. Bear in mind that most of these settings need to be pretty high for it to be better than Very Slow or Placebo. The difference between this and Placebo is that the Motion Estimation Range is much higher in Max.

    Each of the presets on the Video Tab correspond to a particular combination on the Advanced Tab. You can see what they are at this link: http://dev.beandog.org/x264_preset_reference.html


    FFMPEG for VP9:

    If you really want to go down this path, get it from here: https://ffmpeg.org/download.html

    Go into your installation directory, then into the “bin” subdirectory. Make a text file and save it as something like “go.bat” but make sure to save as type “all files” so that it will register as a batch file. Then in there put this text:

Code:
REM Pass 1: a fast analysis pass (speed 4) with audio disabled (-an); its output file is only temporary.
ffmpeg -i input.mp4 -c:v libvpx-vp9 -pass 1 -b:v 0 -crf 20 -threads 8 -speed 4 -tile-columns 6 -frame-parallel 1 -an -f webm temp.webm
REM Pass 2: the real encode at speed 0, copying the audio stream untouched (-c:a copy).
ffmpeg -i input.mp4 -c:v libvpx-vp9 -pass 2 -b:v 0 -crf 20 -threads 8 -speed 0 -tile-columns 6 -frame-parallel 1 -auto-alt-ref 1 -lag-in-frames 0 -c:a copy output.webm
This will take a file called “input.mp4” and encode it to a file called “output.webm”, and it will pass the audio through because of the “-c:a copy” switch.

    You can activate a mode called AQ, which I won’t go into here. But it will improve quality and increase file size so it may or may not be useful to you:

Code:
REM The same two passes with adaptive quantization (-aq-mode 1) and a long keyframe interval (-g 9999); the audio is encoded to Vorbis this time.
ffmpeg -i input.mp4 -c:v libvpx-vp9 -pass 1 -b:v 0 -crf 20 -threads 8 -speed 4 -tile-columns 6 -frame-parallel 1 -g 9999 -aq-mode 1 -an -f webm temp.webm
ffmpeg -i input.mp4 -c:v libvpx-vp9 -pass 2 -b:v 0 -crf 20 -threads 8 -speed 0 -tile-columns 6 -frame-parallel 1 -auto-alt-ref 1 -lag-in-frames 0 -g 9999 -aq-mode 1 -c:a libvorbis output.webm
    When you’re ready and saved the batch file, use the command line to run it, or just double-click it in Windows. It looks like this if it’s working:

[screenshot of FFMPEG running in the command prompt]

NOTE: You may have noticed I’m doing 2-pass here, which technically isn’t required for Constant Quality. I don’t know why, but for some reason the libvpx codec with FFMPEG performs better in terms of quality AND file size when you do it. I know, it’s not meant to, but it does. Don’t ask; this part is already complicated enough. You can just do the first pass at high speed; it will create the temporary file, and then the second pass is the important one, so that’s at speed 0.
     
    6 – Encoding the Audio portion with Handbrake.


    A – Separating the myths from the facts.

    There’s a term called Audiophile. It refers to people who “care” more about audio than your average bear. All over the internet you can find examples of Audiophiles saying that this is better than that, pointing out that there’s a difference between “most humans” and those with “golden ears” or whatever topic is in fashion at the time. By and large, it’s rubbish.

A great old example comes from 1984, shortly after CD technology was introduced. A gentleman named Ivor Tiefenbrun was making claims about what was ruining sound. Ivor was instrumental in developing high-fidelity audio equipment throughout the 70s and 80s and received an Order of the British Empire from Queen Elizabeth II in 1992. He was a renowned Audiophile who said that digital technologies were not able to reproduce audio as well as analogue could. He was challenged to prove this by listening to a record play through his own brand’s equipment with a switch. In the A position, the audio went entirely through his own equipment. In the B position, the audio went into a Sony PCM digital box, was converted into a digital signal, then converted back into an analogue signal and played through the rest of his equipment. In the X position, the box would be in either A or B, but the user would not know which one. The user was free to switch between A/B/X as many times as they wanted before choosing whether X matched A or B.

    http://www.bostonaudiosociety.org/bas_speaker/abx_testing2.htm

The results? Absolutely no ability to differentiate the two. The PCM digital box did not affect the sound enough for Ivor to tell the difference. And as the tester points out, it was effectively only using 13 bits of its 16-bit capability!!! If the founder of a top-quality audio equipment company, awarded by the Queen, could not hear the effect of digital coding at 13-bit depth vs the original analogue audio, then who can?

    Sadly, people everywhere claim things like “I can tell the difference between 16-bit and 24-bit” or “people with golden ears can tell the difference between 44.1KHz and 192KHz”. They are wrong. Not just subjectively wrong, they are fundamentally wrong.

    It stems from the idea that digital audio is stored as blocky steps instead of a smooth curve.

[image: digital audio drawn as blocky steps instead of a smooth curve]

People think that if you make the blocks smaller, the audio quality will be better. This is mathematically incorrect. The Nyquist-Shannon Sampling Theorem is the basis for digital audio. It doesn’t “explain” digital audio; rather, digital audio was created AROUND the Sampling Theorem. It states that there is a sampling rate at which a wave can be reproduced PERFECTLY: double the highest frequency in the wave. For human ears, which top out around 20KHz, this is 40KHz. CDs sample audio at 44.1KHz just to be “safe” and some people distribute audio at 48KHz just to be “safer”. There is a reason to use higher sample rates when you’re an engineer inventing new sound effects, but once the sound effect is finalised, playing it at 44.1KHz is identical to playing it at any higher sampling rate. The Guinness book of world records 2017 states that the human limit is “almost 20KHz”, which is perfectly sampled by a mere 40KHz, 10% lower than that of a CD.

So that’s the sampling rate part; what about bit-depth? Well, that’s to do with dynamic range. The way digital audio works, errors or inaccuracies in bit-depth are dithered to produce white noise in addition to the sound wave. This white noise at 16-bit is pretty quiet. The white noise at 24-bit is 256 times smaller (see the arithmetic below). The reality of this difference is akin to listening to a guy jackhammer the road 1 metre away and still knowing whether behind him there are 1 or 2 mosquitos buzzing around. People who claim they can hear the difference are mistaken.
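For anyone who wants the arithmetic behind that 256:
• Each bit of depth adds about 6dB of dynamic range (20 x log10(2) ≈ 6.02dB).
• 16-bit gives roughly 96dB of range; 24-bit gives roughly 144dB.
• The 8 extra bits make the noise 2^8 = 256 times smaller in amplitude, which is about 48dB quieter.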

    Here’s a nice bullet point list about fact vs myth:
    • Your ears do not hear the “blockiness” of digital audio. The “blockiness” of digital audio limits the frequencies at which it can PERFECTLY reproduce a sound wave.
    • A human who can hear frequencies not allowed by a 48KHz sample rate is equivalent to a human who has 4 cones in their eye retina instead of the usual three, and can see x-rays as easily as the rest of us can see the colour green. Or a human who has eyes in the back of their head to see predators coming. In 100 years of searching for these people, neither one has ever been found. The “golden ears” are equally as rare as eyes in the back of your head.
    • The Guinness book of world records 2017 states that the human limit is “almost 20KHz” which is perfectly sampled by a mere 40KHz, 10% lower than that of a CD.
    • A guy awarded by the Queen for contribution to audio electronics claimed he could hear the difference between original sound and 16-bit 44.1KHz digital sound. He put his reputation on the line to participate in a test that proved he couldn’t do it even with 13-bit digital sound in a noise protected studio under controlled conditions. At least he had the guts to do it.
• The noise floor of a recording studio is usually around 30dB, and the dynamic range of a 16-bit CD is about 96dB (noise-shaped dithering pushes the perceptual range to around 120dB), meaning that to hear the dither noise in a soundproof recording studio, the loudest part of the CD would have to approach 150dB, enough to cause permanent hearing loss in seconds. As loud as firing a sniper rifle.
• The same point above for 24-bit audio, with its roughly 144dB of range, puts the equivalent peak up around 174dB, where 180dB is enough to kill any human being from the pressure of the sound wave alone. In fact, people have been killed by 160dB.
• Just to reiterate the above 2 points: to hear the white noise in 16-bit digital audio, the volume needs to be as loud as firing a sniper rifle, and in 24-bit audio, to hear the white noise you would need to turn the volume up enough TO ACTUALLY DIE. Yet Apple sells 24-bit audio files and people everywhere swear they can hear the difference.

    Some referencing if you would like to know more:
    http://www.head-fi.org/t/415361/24bit-vs-16bit-the-myth-exploded
    https://people.xiph.org/~xiphmont/demo/neil-young.html
    http://www.animepassion.net/topic/2...ot-lossy-and-why-24bit-flac-is-not-necessary/


    B – Definitions, formats and my choices.

    Transparent – Where the encoded LOSSY sound cannot be told apart from the original. Bear in mind that if you encode a sound 100 times “transparently” you will end up with garbage if the encoding is lossy. So this can usually only be done once or twice.

    Passthrough – Where the audio in the original is passed directly into the encoded file. You’re encoding the video, but keeping the audio unchanged.

    PCM – This is an uncompressed pure WAV or CD style format. The raw audio. It can have a bit depth and a sample rate, but almost always they are 16-bit 44.1KHz. You will have these if you recorded gameplay with Fraps or maybe some other software like DxTory.

Dolby Surround – Technically this is stereo. But while stereo is 2.0 channels for 2 speakers, Dolby Surround takes into account speaker separation. So stereo works great on headphones, while Dolby Surround will sound different on a speaker setup where the speakers are in the corners of the room. You can set up your home theatre system to know the distance between the TV and each speaker (they could be at different distances) and it will use Dolby Surround encoding to sync them in such a way that you hear what the author originally intended. Stereo alone cannot do this. However, for the purposes of this guide we won’t really treat Dolby Surround as being different to stereo, and all 2.0 channel audio might just be called stereo for simplicity.

    AC3 – Also known as Dolby Digital. This is what you usually get on a DVD and it’s usually in 5.1 channels (6 channels) at 384Kbit/sec giving 64Kbit/sec to each channel. Dolby TrueHD is a lossless version in 96KHz 24-bit sometimes found on Blu-Ray and always on HD-DVD.

    DTS – Once called Digital Theatre Systems it’s an audio codec that was a competitor to Dolby Digital. DTS is widely accepted to be slightly better than AC3 and current Blu-Ray discs often contain DTS MA which is a lossless version, up to 192KHz in 24-bit.

AAC – Apple’s codec of choice. Usually considered transparent at 128Kbit/sec stereo, or 64Kbit/sec per channel.

Opus – Fully sick new codec that is usually considered transparent at 112Kbit/sec stereo, or 56Kbit/sec per channel. It outperforms AAC, Vorbis and MP3 at all bitrates up to 112Kbit/sec, after which point they all start sounding like the original, except MP3, which needs more like 160Kbit/sec. It’s also quite smart: it can use unused bits from some channels to support others that need help. Latency is also much better with Opus, but that’s not really related to this guide. Finally, if you look up YouTube’s recommended upload specs, they say MP4 with AAC audio. But if your video becomes popular or stays up for long enough, they actually convert it to WebM with Opus audio. Upload with Opus audio from the start and the only conversion that happens is that they will create lower bitrate versions for viewers with poor internet connections. Your top quality video/audio streams will remain unconverted.

    Don’t believe me about Opus? Check these blind tests out:
    https://people.xiph.org/~greg/opus/ha2011/
    http://listening-test.coresv.net/results.htm
    http://www.opus-codec.org/static/comparison/GoogleTest1.pdf
    http://www.opus-codec.org/static/comparison/GoogleTest2.pdf

    https://unrealaussies.com/data/aga/videotutorial/Combined.wav

    This WAV file example contains the original audio, a version in Opus at 16kbps and a version in MP3 at 24kbps. You can clearly hear that even at a measly 16kbps Opus still sounds OK and is far superior to the 50% higher bitrate MP3.

    Dolby Pro Logic 2 – DPL2 is a strange beast. It’s 2 channels encoded in a way that will sound great on a stereo system, but also have surround sound qualities on a 5.1 system. It has some secret data in the encoding that will help your 5.1 system extract sound out of the stereo part to add it to the rear or centre speakers with particular delays that make it sound like real surround sound. It’s not real however, it’s an illusion, but it’s a pretty decent one.


    Recommendation for DVDs:
Converting AC3 5.1 channel into stereo often produces undesired effects. Sounds can be left out or come through in an unintended way. DVDs will often include a 5.1 sound track AND a 2.0 stereo track, where the stereo has been mastered to sound the way the author wants it to sound (on 2 speakers).

Using Handbrake, I recommend taking the AC3 5.1 sound track and down-mixing it into Dolby Pro Logic 2. The encoder will use half the bitrate for each channel, and within that, a little can be taken aside to allow for surround effects. The LFE (subwoofer) channel will be dropped completely and your sub will just use the general soundwave. YOU DO LOSE QUALITY doing this (only really through speaker separation, not soundwaves), but the playback on a stereo system or headphones will be transparent, and a 5.1 surround sound system will do better than if you encoded a regular stereo stream, though not as well as the original. It’s a compromise in some situations, but not in others. 160Kbit/sec for DPL2 gives 80Kbit/sec to each channel when the original only had 64, so the extra 16Kbit/sec can be dedicated to adding the secret coding for surround effects (this isn’t really how it works, but it’s a decent way to think about it). It saves you 101MiB per hour of footage.

If you are an “audiophile” and you want the original surround sound, then by all means pass through the AC3 5.1 channel track. But remember, playback on a stereo system or headphones will sound bad, and you’re doing this to prevent that, so you’ll have to include the stereo track too, which is another 128Kbit/sec, so another 57MiB per hour. With 100 hours of footage you’re looking at 15.8GiB, so make the decision yourself, but I personally do the DPL2 mixdown.
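In HandBrakeCLI terms, that DPL2 mixdown looks something like this. A sketch only (example filenames, flags from the 1.0.x CLI; check --help on your build):

Code:
HandBrakeCLI -i dvd_title.mkv -o output.mkv -f av_mkv -e x264 -q 20 -E av_aac --mixdown dpl2 -B 160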


    Recommendation for OBS Recordings:
    Passthrough. Basically, when you record in OBS I recommend lossless video so that you can properly encode it in Handbrake later, since Handbrake produces much better quality but can't do it in real time without using obscene amounts of space. The audio however can be encoded quite easily during recording, since it's considerably less complicated. OBS by default will encode into AAC 128Kbit/sec which, if you look above, is the transparent rate for stereo AAC. UPDATE: OBS now has a default bitrate of 160kbps AAC, which I recommend changing to 128kbps. Just pass that through in Handbrake and your audio will remain unchanged. If you're clever, you'll encode it in OBS to Opus at 112Kbit/sec and then pass it through Handbrake, saving 7.2MiB per hour. But if you didn't do that, still passthrough in Handbrake, because your audio has already been recorded lossy at a rate JUST high enough to be transparent. If you convert it again you could start to hear differences, because you're introducing 2 lossy iterations.


    Recommendation for Fraps Recordings: Encode to Opus 112Kbit/sec stereo. Fraps records your audio in PCM just like a WAV file, so you HAVE to convert it or it will cost you over 600MiB per hour (that's a bitrate of 1411.2kbps just for stereo)!!! If you have a surround sound setup and gameplay you MIGHT want to save it as DPL2 to keep some of that, or even 5.1 channel Opus at 336Kbit/sec to mimic DVD surround sound, or possibly 256Kbit/sec since Opus will intelligently allocate less to the low-requiring LFE and more to the other channels as they need it. Either way, surround sound Opus at these bitrates will sound more like the original than Dolby Digital would. However playback on YouTube will be stereo for most viewers, or even mono on their phones, so if YouTube is your goal then you're pretty much wasting bandwidth. My recommendation: don't stress about it. 112Kbit/sec stereo Opus sounds amazing. You CAN do FLAC for lossless purposes. I recommend FLAC if you're going to be doing lots of editing, clipping and effects, so that you don't lose quality with every render, but your editing software needs to be able to handle FLAC. Regardless, the final version can be Opus and nobody has a right to hate this.


    C – Examples of the Handbrake Audio Tab.

    [​IMG]

    The above screenshot shows what the Audio Tab looks like when the source video was recorded with Fraps. In this example the source has one single audio track in an "Unknown" language. The format is Pulse Code Modulation (PCM): it's 16-bit, signed (can be positive or negative) and little endian (backwards to normal handwriting). These details are unimportant; all we need to know is that Handbrake recognises it, and therefore Handbrake can convert it into another format. PCM cannot be passed through with Handbrake (because it would be stupid to do so).

    One detail that IS important is that the source audio is in 2.0 channels meaning stereo. You will only be able to mixdown to stereo or mono. Handbrake will not “invent” new channels.

    You have 3 main options:
    1. Encode lossless to FLAC. Change the codec to FLAC 16-bit. Don't use 24-bit, because the audio rate will be 50% larger for no difference in soundwave, and it also means you didn't read about it above so you haven't done your research. This is compatible with good media players and PCs, but often not car stereos or cheap/old home theatre systems. When you set the codec to FLAC, the bitrate setting will vanish, because FLAC is always lossless; the bitrate varies to accomplish that. Usually you end up with around 700kbps (about half a CD's bitrate) or 308 MiBytes per hour.
    2. Encode to AAC 128kbps bitrate. 64kbps per channel is widely considered transparent for the avcodec AAC encoding (avcodec is Handbrake's AAC encoder of choice). Because this is stereo, you'll need 128kbps. This will be the most compatible with players since it's Apple's format and has been in use for yonks. MP3 is just as compatible but requires more bitrate, so there's no reason to do MP3. 128kbps is 56.25 MiBytes per hour.
    3. RECOMMENDED Encode to Opus 112kbps bitrate. Less compatible on players than AAC, but WILL become more and more widely used in the immediate future. Opus IS the future. 112kbps is just under 49.25 MiBytes per hour. (See the sketch below this list for how all these per-hour figures are worked out.)
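
    If you want to sanity-check those per-hour figures, or work out your own for a different bitrate, here's a minimal Python sketch. Note that it assumes the convention used throughout this guide of treating 1kbps as 1024 bits per second:

        # Convert an audio bitrate in kbps to MiB per hour of footage.
        # Assumes this guide's convention: 1 kbps = 1024 bits per second.
        def mib_per_hour(kbps):
            bits_per_hour = kbps * 1024 * 3600
            return bits_per_hour / 8 / 1024 / 1024

        for codec, kbps in [("FLAC (typical)", 700), ("AAC", 128), ("Opus", 112)]:
            print(f"{codec}: {kbps}kbps = {mib_per_hour(kbps):.2f} MiB/hour")

        # FLAC (typical): 700kbps = 307.62 MiB/hour
        # AAC: 128kbps = 56.25 MiB/hour
        # Opus: 112kbps = 49.22 MiB/hour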


    [​IMG]

    This example above is what the Audio Tab looks like from one of my OBS recordings. The video was recorded lossless but is HUGE so I’m using Handbrake to remedy this. The audio however was encoded DURING the recording into AAC 128kbps stereo, which is the default setting for OBS.

    There is no point doing FLAC with this. The audio has been lossy encoded and has already lost quality. Putting it into FLAC now will just preserve that already-degraded audio in a track using 5 to 6 times more data, and you gain nothing.

    You have 2 main options:
    1. Encode to Opus 112kbps bitrate. You lose some quality because you're lossy encoding for a second time on the original audio. You also lose compatibility on some players. But it will PROBABLY still sound exactly the same and will use about 12.5% less data on your disk at just under 49.25 MiBytes per hour.
    2. RECOMMENDED Auto Passthru. Set this in the codec drop-down and your AAC 128kbps audio track will be copied bit for bit into the final compressed video file. In other words, it remains unchanged, so you won't lose any more audio quality than you did when you first recorded it in OBS. You're looking at 56.25 MiBytes per hour and to me it sounds identical to when I played the game.


    [​IMG]

    This time the example is from a DVD. I like to archive my disks in H265 so I don't have bookcases full of them, and I can play one straight away on my TV with a media player without having to find it on the shelf or worry about damaging or losing it.

    You’ll notice the source disk could have many audio tracks in multiple languages. This one has 2 in English. One is 5.1 channel AC3 (Dolby Digital) designed for home theatre systems that have 5 speakers and a subwoofer. The other is 2.0 channel AC3 (Dolby Surround or just stereo) and it is REMASTERED to sound the way the author wants it to on 2 speakers. It’s not the same as just the left and right of the 5.1 version, it’s actually a little different. Playing 5.1 on a stereo system can sometimes sound a little awful, so players will recognise that they can only play stereo and therefore choose the 2.0 channel track to play.

    You have 3 main options:
    1. Encode to Opus 112kbps bitrate stereo only. If you really don't give much of a crap about the sound quality of this DVD, you can delete the second audio track and just leave a single one from the 2.0 stereo source. Encode this with Opus at 112kbps with the mixdown set to Stereo, Dolby Surround or Dolby Pro Logic II and you're good to go. Play it on a 5.1 home theatre system and the rear speakers might just play their front speaker counterpart's audio, or they might not, while the sub just guesses what to do. Your final file will have only one audio track and it will be under 49.25 MiBytes per hour.
    2. Passthru the surround sound and encode a second stereo track. If you're fussy about the sound you can passthrough the AC3 5.1 channel audio track so your final file will match the original DVD on a 5.1 home theatre system. These are often 384-448kbps, or 167-197 MiBytes per hour for all 6 channels. As stated a couple of times above, if you play this back on a stereo system it could be crap and you might miss sounds, so since you care so much about perfect audio, you should keep a stereo track for playback on stereo systems. Considering this is for low-end systems, Opus might be incompatible, so forget about that. You can passthrough the stereo track for another 192kbps or 84.4 MiBytes per hour, combined with the first track making 576-640kbps or 253-281 MiBytes per hour. Or you can say you care about the surround quality but not the stereo quality, and you're only keeping stereo for compatibility with low-end systems. In this case take the 2.0 stereo track from the source and encode it to AAC 128kbps or 56.25 MiBytes per hour, combined with track 1 to make a total of 512-576kbps or 225-253 MiBytes per hour.
    3. RECOMMENDED Mixdown to Dolby Pro Logic II in Opus 160kbps. Option 1 sounds perfect on stereo but has no surround sound capability @ 49.25 MiBytes per hour. Option 2 will sound perfect on surround sound and great to perfect on stereo but uses between 225-281 MiBytes per hour. If you make the final file contain only 1 track and use the 5.1 channel AC3 source but mixdown to Dolby Pro Logic II with a bitrate of 160kbps you have something else entirely. It will capture all of the surround sound waves, yet not be remastered specifically for stereo playback nor surround sound playback, but rather a combination of both. Playback on a stereo system will sound NEARLY as good as option 1 and 2, possibly transparent for most DVDs to most people. Playback on a surround sound system will sound NEARLY as good as option 2, also possibly transparent for most DVDs to most people. You end up with a single track that is stereo, with a little extra data to give a good home theatre system some information about how to turn that into 5.1 surround sound. AAC can do this, but Opus is much better at it with more intelligent bit allocation for sound separation vs quality. Your system will need to play Opus, but it will sound fantastic no matter the speaker setup you have and still only 160kbps or 70.3 MiBytes per hour. My media player cost me $80 at the end of 2016, it plays these files and it sounds fantastic.


    NOTE: If your DVD has only 1 audio track in the source and it’s 5.1 but you want option 1, then just use the 5.1 source at 112kbps Opus and mixdown to Dolby Pro Logic II. It means that they didn’t remaster it specifically for stereo systems but you can capture all the sounds by downmixing. If you wanted to do Option 2, you will only need one track, just passthrough the AC3 or DTS. Also note that if you see DTS, it’s conceptually the same as AC3 and all the same principles apply, just that often some people think DTS originals sound better than AC3s.

    So in the end, there's a whole bunch of different ways DVDs will have their audio, affecting how you might decide to make your final file. But ultimately, once you settle on the quality you're happy with, you'll worry about it less and less. Blu-Rays will have slightly different codecs too, but your end result will be the same. Generally speaking, if you want the original audio, passthrough. But if you're happy with a slight, possibly unnoticeable quality drop, then you can encode to save a lot of space. Audio has come a long way; when DVDs and Blu-Rays were invented, audio needed a lot of space to be truly spectacular. Nowadays, if your media player can play Opus in Dolby Pro Logic II, you can make massive savings. If there was another disk format being released this year to supersede Blu-Ray, it would surely allow Opus. In a way there is: YouTube and Netflix are both adopting it.

    When it comes to the recordings from OBS or Fraps, they are much more straightforward as you can see!
     
    #6 Agamemnus, Feb 14, 2017

  7. Agamemnus
    7 – Streaming basics and a comparison – CPU vs NVENC vs Quick Sync.


    As stated in part 2, CPU encoding will, at the time of writing this, get the most quality possible into your bandwidth. For years people have claimed that hardware encoders can do all sorts of things, but the simple fact today is that $ for $, a CPU will do a better job. The other methods come in handy if your CPU is crap. Let's say you've had a computer for 4 years and you recently managed to save enough cash to get a shiny new GPU; then you may benefit from part 8 of this tutorial (Quick Sync and NVENC) and applying some of those settings to your streaming. However, if your CPU came out within 3 years of your graphics card (true for most builds up to and including 2017), then it's most likely that using your CPU will get you the best results. See the end of this section for examples of bitrates with different hardware.

    Step 1 is to go to speedtest.net and test your upload speed. Try to close any programs that are using the internet first, so that you get the best result you can. In case you didn't know, you should also do this before you stream. What you want is a ping below 50, a download above 8Mbps and an upload above 0.8Mbps. You may be able to stream with a slower upload, but your quality will really start to suffer. So if it's low, stick to games like Hearthstone and avoid games like Unreal Tournament.

    You need to save about 50kbps of upload for your gaming and another 50kbps for voice comms. Then you should also allow 10% of your max for fluctuation. So whatever your upload was on speedtest, take 10% off, then another 100 off that, and that's the number you use as your bitrate in OBS. For example, if you had 1.0Mbps upload on speedtest, you would take 10% off for 900kbps and then another 100 off for a final figure of 800kbps (there's a short sketch after the resolution list below that runs this calculation). Bear in mind this is a general rule; the game you play and your specific connection may require you to take more off your "maximum" upload.

    Resolution, upload and download are a mixed bag of situational appropriateness. What I will say is that YouTube can take any upload speed you have up to 6Mbps and will send it to all viewers at whatever download speed they can take. Twitch is less reliable; you may be able to upload at 3000kbps, yet many users, like me, have a download speed of 8000kbps but still get constant buffering on Twitch streams. That's because Twitch's stream speed isn't as consistent as YouTube's. If you are in Australia and looking at a mostly Australian audience, I recommend keeping it to 2000kbps as of the year 2017. If that changes, I'll update. But for now, YouTube is more reliable to watch, but Twitch is more popular, so you should take more care with it.

    Upload speed:
    • 600-800kbps: do it in 480p.
    • 800-1500kbps: try 720p.
    • 1500-2000kbps: go for 1080p.
    • If you are struggling just a touch at 30fps, turn it down to 25; the data saved will go into resolution quality and connection reliability.
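
    To make those rules concrete, here's a minimal Python sketch that turns a speedtest upload result into an OBS bitrate and a suggested resolution. The thresholds are just the rules of thumb from this section, not gospel:

        # Turn a speedtest.net upload result (kbps) into an OBS stream
        # bitrate, using the rule of thumb above: 10% headroom for
        # fluctuation, then ~100kbps for game traffic and voice comms.
        def stream_bitrate(upload_kbps):
            after_headroom = upload_kbps * 0.9   # 10% off for fluctuation
            return int(after_headroom - 100)     # ~50 gaming + ~50 voice

        def suggest_resolution(upload_kbps):
            # Thresholds from the upload speed list above.
            if upload_kbps < 600:
                return "too slow to stream comfortably"
            if upload_kbps <= 800:
                return "480p"
            if upload_kbps <= 1500:
                return "720p"
            return "1080p (cap around 2000kbps for AU audiences)"

        upload = 1000  # 1.0Mbps from speedtest
        print(stream_bitrate(upload), suggest_resolution(upload))
        # 800 720p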

    Open up OBS and hit the settings button.


    Audio, push-to-talk and hotkeys Settings:

    You will not need to touch the Advanced settings, but I recommend going to the Audio settings and ticking “push-to-talk” for your Mic, then go to Hotkeys settings and set a push-to-talk button for the Mic. This means that it will only record your Mic to the stream/recording if you have this button down. So if you cough or your girlfriend starts yelling at you during the game, it won’t come through without you realising. In the Audio settings, leave the sample rate at 44.1KHz. If you haven’t read above about why higher isn’t necessary then just take it on faith. Please. Feel free to have stereo channels though, that definitely makes a difference to the game audio. You can set different Audio tracks so that they can be edited separately later on, but it’s usually not necessary. I don’t do it.


    Stream Settings:

    This is going to depend entirely on where you stream to. Basically select your service, choose the location nearest to you and pop in your stream key. For YouTube the server won’t matter, for Twitch it will, others may or may not matter too. This is really up to you and your streaming service.


    Video Settings:

    [​IMG]

    Base Resolution = Set to your native screen resolution, or the resolution of the game you want to capture. Generally these days this will be 1920x1080.

    Output Resolution = This is if you want your output scaled differently. The stream settings allow you to set this for the stream, and the recording settings will allow you to set it differently for local recordings. Because of those features, I recommend leaving the Output Resolution the same as your Base Resolution.

    Downscale Filter = Lanczos. If your CPU can't handle this then you shouldn't be doing it at all; most likely it will though, so don't stress. If your final stream or recording is different to your game's actual resolution, it needs to be shrunk, and this is the filter OBS will do it with. In my experience, if the difference between Lanczos and Bicubic is 5% of your CPU, that 5% is better used here than it is in your encoding process. If you are recording to HDD for upload later, then you'll probably be doing it all in full resolution anyway and this won't even get used.

    FPS Value = There’s a few ways to set this via the drop-down, but I always use “Integer FPS Value” and set the number to whatever I want. I recommend 25 for most situations, then once you’ve got the hang of stuff and want to push your recordings further, you can experiment.

    Disable Aero = Tick this box. It gives you a performance boost that is too good to ignore. This will help keep your gameplay at a good level and the recording will get extra CPU work for a better video quality.


    Output Settings:

    [​IMG]

    NOTE: In the latest version of OBS it sets the default Audio bitrate to 160kbps. I recommend flipping to the Audio Tab and changing it to 128 since that’s widely accepted as transparent for AAC and those extra 32kbps could be better used on your video quality.

    First off, change the Output Mode to Advanced.

    Now you should be on the Streaming Tab.

    Encoder = Your CPU is the best encoder in your computer unless you have a system setup that is extraordinarily strange and I’ll bet good money that you don’t. Set the Encoder to x264 then proceed to section 8 for help on how to tweak it. For people who absolutely must do it differently, you can choose NVENC or Quick Sync and look in section 8 for some examples with them. Honestly, for the same quality they use a CONSIDERABLY higher bitrate. The same applies in reverse, if you are limited to a bitrate, then the quality will be CONSIDERABLY lower.

    Enforce streaming service encoder settings = This can help if you have compatibility issues, but I manage just fine without it ticked.

    Rescale Output = This is where you scale your streaming output. It will do so proportionally to the OUTPUT resolution from the Video settings section. That’s why I leave them the same there, so you can be specific once you get to this tab. Look above at the speedtest section for what resolution to use for your bitrate.

    The rest will be in Section 8 because it is specific to your Encoder selection. So to help you choose I’ve got some examples of what each one looks like at 3 different bitrates. Please bear in mind that even if you use a different bitrate, the quality difference between encoders will be the same. If you look at the difference between Quick Sync, NVENC and x264 at 4Mbps, 2Mbps and 1Mbps and notice that one looks consistently better than the other, yet you can only upload at 800kbps or 1.5Mbps then you can safely assume that the same quality difference will apply.

    The video is a 5.36 second (134 frames) clip of Tracer in Overwatch doing her Highlight Intro called Lion Dance. The lossless version is 53.5MiB at an overall bit rate of 83.7Mbps. The audio is 44.1KHz AAC at 128kbps to reflect the real-world likelihood that some of your upload rate is used for audio and in all encode examples the audio is passed through and unchanged, therefore only the video changes.
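
    As a quick sanity check on those numbers (and a handy formula for any time you want to know a file's overall bitrate), a little Python:

        # Overall bitrate of a clip = file size / duration.
        size_mib = 53.5      # lossless clip size in MiB
        duration_s = 5.36    # 134 frames at 25fps
        mbps = size_mib * 1024 * 1024 * 8 / duration_s / 1_000_000
        print(f"{mbps:.1f} Mbps")  # 83.7 Mbps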

    The image shows a frame where Tracer is spinning around. It's high-motion on Tracer but the rest of the frame is pretty stationary. NVENC and Quick Sync were unable to produce 1080p video at 1Mbps; for QSV the file was identical to the 2Mbps version, and for NVENC it was a tiny bit larger, which is stupid. So here they are in order of bitrate then quality:

    Original
    [​IMG]

    x264 4Mbps Medium
    [​IMG]

    x264 4Mbps Faster
    [​IMG]

    x264 4Mbps SFast
    [​IMG]

    Quick Sync 4Mbps
    [​IMG]

    NVENC 4Mbps
    [​IMG]

    x264 2Mbps Medium
    [​IMG]

    x264 2Mbps Faster
    [​IMG]

    x264 2Mbps SFast
    [​IMG]

    Quick Sync 2Mbps
    [​IMG]

    NVENC 2Mbps
    [​IMG]

    x264 1Mbps Medium
    [​IMG]

    x264 1Mbps Faster
    [​IMG]

    x264 1Mbps SFast
    [​IMG]
     
    #7 Agamemnus, Feb 14, 2017

  8. Agamemnus
    8 – Streaming examples for CPU, NVENC & Quick Sync.


    CPU (x264):

    [​IMG]

    Rate Control = CBR. We do things differently for recording than we do for streaming. In recording we worry less about bitrate and more about quality. But now with streaming, we want to set our bitrate and get the best quality we can out of that. With CBR we use the same bitrate constantly; when that's enough for a perfect image we get one, and when it isn't we just get the best we can manage within the bitrate.

    Bitrate = This you should have selected from your speedtest.net result in section 7.

    Custom Buffer Size = I normally leave this unticked, but it can be tweaked. For bitrates between 1000 and 2000 you can set this to 1000 for some extra quality when it takes a hit, but the drawback is that users might need to buffer your stream when this happens. The bitrate plus your buffer size should be your maximum upload rate calculated in section 7, but if your bitrate is under 800 then just leave it unchecked. If that causes you problems, then tick it and set the buffer to something tiny like 50 and reduce your regular bitrate by the same amount to stay under your cap.

    Keyframe Interval = This is something that your streaming service might ask you to set. Longer keyframe intervals of say 10 seconds mean you can get better compression (more quality into your bitrate) but what it means for the viewer, is that if they start watching your stream, or when they skip forward/backward (in a VoD or some streaming services allow this) they will skip to the LAST 10 second mark, because that was your keyframe interval. Setting a shorter keyframe interval makes the viewer feel like your video is more responsive, but your quality per bitrate will suffer. I recommend leaving it on Automatic, but you can tinker with this depending on the game you’re playing or the purposes of your stream. If it’s a long HotS match then maybe it doesn’t matter, but if it’s a tutorial on a complex grand strategy game then the audience may appreciate the ability to skip around.

    Profile & Tune = Leave as None unless your streaming service asks you to set it specifically. It can affect playback on certain devices, but that’s more important for saved video files than it is for streaming. If a crappy phone is streaming something it can’t handle, then they should turn down the resolution instead of you accommodating them by implementing a quality hit to the people who have the proper equipment. Recorded videos compress much better (by being slower) so you can accommodate more viewing equipment, but when streaming we are often already using non-perfect quality due to bandwidth restrictions.

    Variable Framerate = Leave off. I’m not sure why this would help anybody for streaming purposes unless you’re trying to manage your total internet usage per month.

    x264 Options = Leave this blank. If you have a crappy CPU then you might benefit from putting "opencl=true" in here to take some load off your CPU with your graphics card. But in the majority of cases a CPU can handle that part faster itself than it can by sending it to the graphics card, waiting, then reconstructing the data after it gets the results back. You will have to do some testing to see if that works for you. On a more advanced level, this is where you can change specific parameters from the Preset you're using to try to get a little more quality into your bitrate without jumping all the way to the next slowest preset. Use this link to get it right: http://dev.beandog.org/x264_preset_reference.html

    CPU Usage Preset = This is the important part to get right once you have your bitrate set. In this example I’ve set it to Medium which is reasonably ambitious for a 2017 CPU. The SLOWER the preset the better your quality per bitrate, because your CPU is working harder to fit more quality in. However, if your CPU cannot keep up with the game’s framerate, then it will drop frames. Dropped frames don’t just make the final stream look choppy, but they also waste processing time, since sometimes the frames that are dropped have had some work done on them. So dropping 5% of frames is worse for your final quality than using a preset that is 5% less on your CPU. The moral of the story is that if you get dropped frames or your CPU maxes out, you need to choose a faster preset and wear the lower quality.

    What you want to do is pick a preset and test it. OBS will tell you down the bottom of the main screen how much CPU it is using. If that goes over 80% you could be in trouble, because your game still needs some CPU to run and you don’t want OBS or your game to get less than they need. Different games have different requirements though, so maybe open up your Task Manager or Resource Monitor and have a CPU graph up. Play the game while streaming for a minute then ALT-Tab back and see what the graph looks like. If it hit 100% at any time, then you need a faster preset because either your game will suffer or OBS will drop frames. If it stays below 60% then you may be able to turn the quality up a little by setting a slower preset for some more quality. If you find that you get 60% usage on one preset but max out on the next slowest, then you can use the “x264 Options” part to increase a parameter or two. That kind of tinkering is beyond the scope of this guide and probably not for you if you need to read a guide in the first place. But when you become more accustomed to this sort of thing, it’s there if you want.
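
    If you'd rather not eyeball a graph, here's a rough sketch of the same test in Python. It assumes you've installed the third-party psutil package (pip install psutil); run it, ALT-Tab into your game for a minute while streaming, then check the verdict against the guidance above:

        # Sample total CPU usage for a minute while you stream and play,
        # then report the peak and average against the 60%/80%/100%
        # guidance above. Assumes: pip install psutil
        import psutil

        samples = [psutil.cpu_percent(interval=1) for _ in range(60)]
        peak, avg = max(samples), sum(samples) / len(samples)
        print(f"peak {peak:.0f}%, average {avg:.0f}%")
        if peak >= 100:
            print("Maxed out: pick a FASTER preset.")
        elif peak < 60:
            print("Plenty of headroom: try a SLOWER preset for more quality.")
        else:
            print("About right: leave the preset where it is.")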

    Finding the right balance can be tricky and you may need to spend an hour or two testing a couple of different games to get the hang of it. Ultimately, the main benefit of x264 CPU streaming is that you can tweak it to get the best out of your equipment and internet connection.


    Quick Sync (QSV):

    [​IMG]

    Depending on your CPU model and therefore your version of Quick Sync, your screen may look slightly different to this. Just use this as a guide, if some options are different you can look them up if you need to but many of them won’t need adjusting.

    Target Usage = Quality. Performance is for people who want to quickly encode heaps of files at massive bitrates in one hour instead of two. We’re streaming now, we have a constant bitrate and we don’t need the encoder to do it FASTER, we need it to do it BETTER.

    Profile = High. This isn’t theoretically required, but current Quick Sync versions need to know target parameters so just tell it High. That will allow Quick Sync to do its best work.

    Keyframe Interval = 3. Another example of where the hardware nature of Quick Sync requires a parameter that software like x264 can work around. I recommend 3 seconds, but you may be able to adjust it depending on your Quick Sync version, or it might not let you.

    Async Depth = 4. This won’t affect you for the purposes of this guide. Just leave it.

    Rate Control = CBR. For the same reasons stated in the x264 part above. Other methods come in useful for recording to HDD, but for streaming, this is what you want.

    Bitrate = This you should have selected from your speedtest.net result in section 7.


    NVENC:

    [​IMG]

    Again this may look different depending on your Graphics Card and therefore your version of NVENC.

    Rate Control = CBR.

    Bitrate = This you should have selected from your speedtest.net result in section 7.

    Keyframe Interval = NVENC lets me leave this at 0 (auto) but it might not let you. Choose 0 if you can or if you can’t, try 4 or 3.

    Preset = Bluray. If you don’t have the Blu-Ray option then your card is probably too old to be trying to stream with it. Other feasible options are “High Quality” or “Low-Latency High Quality” but in my experience Blu-Ray does the best job.

    Profile = High, to allow NVENC to use as much parameter flexibility as it can.

    Level = 5.1 for the same reason as High Profile above. At the time of writing this OBS only supports up to Level 5.1 but if it ever supports higher than that, I wouldn’t go with it, just stick with 5.1. Read section 3-A if you want to know why.

    Use Two-Pass Encoding = Yes, tick this box if it’s available to you.

    GPU = 0. This is so that if you have multiple cards you can specify which one does the work. If you only have one, then number 0 it is.

    B-frames = As high as you can make it up to 16. Mine only goes up to 4.
     
    #8 Agamemnus, Feb 14, 2017

  9. Agamemnus
    9 – Recording examples for CPU, NVENC & Quick Sync and recording while streaming.


    This is all about recording to your HDD for keeping or maybe sharing later. You no longer have the bitrate concerns of streaming. Ideally you will record lossless video with transparent audio, so that you can use Handbrake later to encode your video into a more manageable size and passthrough the audio. Sometimes this isn't possible however, so I've provided recommendations on what to do if you can't manage lossless. It's not perfect, but it's so close I believe most people would really struggle to notice any differences.

    The other thing you can do is record to disk WHILE YOU ARE STREAMING!!! The idea being that you stream live, but because the quality isn’t the best due to bitrate limitations, or you want to do some editing, you can simultaneously record to disk. This disk recording can be a completely different resolution too, so while you’re streaming in 720p you can still save to your disk in 1080p for upload to YouTube later.

    Also note that I recommend MKV files (which you can read about in section 3) and that you do NOT tick the Rescale Output box in this tab. The recording to disk should be done in the game’s resolution and not modified.

    For lossy versions, you'll notice that I recommend 160000kbps for NVENC but only 100000kbps for Quick Sync. The truth is that later versions of NVENC support lossless, which is great, but for versions that don't, their lossy encoding is not as good as Quick Sync's. Both numbers are very high, and you will most likely never see your actual final video using that kind of data rate. They are just caps, for those tiny fractions of time where the encoding needs huge data rates during things like high-motion. Don't stress, neither version will give you that kind of absurd data rate for hours of footage. They are just precautionary.


    CPU (x264):

    If you are streaming, then you should be using your CPU to encode the video stream. If you are doing so, then maybe use Quick Sync or NVENC to record to your disk so that as much of your CPU can be put into streaming as possible. If however you don’t have NVENC or Quick Sync but you want to stream AND record, then you can set your CPU preset in the recording tab to UltraFast, so that as much power as possible can be used in the streaming tab. You will be exposed to more dropped frames though, so you need to test this and find your optimal preset for the Streaming tab.

    If you’re just recording to disk but not streaming, then this is definitely the way to go.

    [​IMG]

    The only option here that matters is the one labelled x264 Options (separated by space), where you insert "qp=0", which means LOSSLESS. OBS will now ignore all other interfering parameters for the disk recording and encode lossless video. If you set CBR, that will do nothing. Bitrate will do nothing. Profile could possibly decrease file size if set to "high" but will probably do nothing. Tune you want set to "None" and Variable Framerate NOT ticked. Those last 2 probably also do nothing, but I can't confirm that yet, so leave them off.

    The CPU preset however will do something. It determines how much work goes into making your recording lossless. It does need to keep up though, so if you set it to Placebo on ANY CPU available at the time of writing this, you will drop frames, because it's too hard to encode Placebo at 25 or 30 fps. Just set it to something that will not max out your CPU; test this if you have to. The SLOWER the preset, the smaller your lossless recording file will be. But you are going to encode it with Handbrake later anyway, so it's OK if the lossless part is larger than it needs to be; you won't be keeping it. Feel free to just put it on SuperFast and never worry about it, unless you're streaming at the same time, in which case use UltraFast. If your CPU can't handle either of those, then you might need to look at the Quick Sync or NVENC parts below.


    Quick Sync (QSV):

    This is trickier, since you can’t do lossless with Quick Sync at the time of writing this. These settings will try to get the best quality possible without regard to file size. You should still probably convert to a more manageable file size later with Handbrake, but doing so will introduce a second lossy encoding which means you’ll lose some quality. However, I’ve found that if the original recording is done at such high quality, this second-iteration-quality-loss isn’t noticeable in most cases, while the file size difference is well worth it. I have one recorded lossy at 8.5GiB for an hour which becomes 2.2GiB after Handbrake and still looks fantastic.

    [​IMG]

    Target Usage = Quality. Just like with streaming, we want to fit as much quality as we can. Quick Sync is so fast that performance would accomplish nothing for us here.

    Profile = High. Again not required as with streaming, but it could make a difference in rare cases.

    Keyframe Interval = 3. Your version of Quick Sync may allow changes here, but mine doesn’t. If you can set it to 0 then do it, otherwise 4 or 3 are fine.

    Async Depth = 4. Not relevant, just leave it.

    Rate Control = This is the important part. What you have available to you will depend on your CPU model and therefore your version of Quick Sync. Also, what you choose here will affect what other options do or don’t come up below. My recommendation is an order of preference going: LA ICQ -> ICQ -> CQP -> LA (VBR) -> AVBR -> VBR -> CBR. At the time of writing this, LA ICQ is the best.

    ICQ Quality = 20 or lower. Try 20 and after you do a Handbrake encode according to section 5 of this tutorial see if you can see any blurring or pixelating. If you do then you may benefit from a better original recording of 19 or even 18, but your file size will start to get extremely large. If you can spare the disk space then don’t stress, do whatever you need to do. In my opinion, 20 usually does the trick, but different games give different results.

    Lookahead Depth = 40. This isn’t overly important. As long as you can get 40, you’re set. It’s only applicable for some Rate Control types.

    QPI, QPP & QPB = 20. If you couldn’t use ICQ modes and had to go for CQP mode, then set these to 20. Turning it down to 19 or 18 will give you even better quality like in the ICQ paragraph above, but it’s probably not necessary. With CQP mode, your overall bitrate will be whatever is needed for the quality, but it will be capped at the bitrate you set below.

    Bitrate = 100000. If you can use ICQ mode then you won't see this. But if you have to use CQP or a VBR mode then you'll need something here. I haven't thoroughly tested this, but my understanding is that Quick Sync at 80000kbps is phenomenal quality. If you are forced to use CQP mode, then you can set your QPI, QPP & QPB numbers to 20 and the actual bitrate may average only 25000-30000kbps, but by setting the cap high you're telling it that if it absolutely needs to use more data, it may. If Quick Sync needs more than 80000 I'd be surprised. We're setting it to 100000 just in case.


    NVENC:

    Some NVIDIA cards support lossless, others don’t. We will cover both.

    [​IMG]

    If you can set your Rate Control to Lossless, then do it and try to record something. If it works, then you’re set, don’t worry about the other settings. If however it doesn’t work, then it’s not supported by your Graphics Card and you should try the below….

    Rate Control = VBR. We’re going to allow such a high bitrate that the card will just encode pretty much all the quality.

    Bitrate = 160000. That’s 20MByte/sec file size but that’s OK, because you will be encoding with Handbrake later to reduce that. Like with the Quick Sync example above, this introduces 2 iterations of quality loss, but I find this is usually pretty good.

    Keyframe Interval = Auto or 0.

    Preset = Bluray.

    Profile = High.

    Level = 5.1 just like the other NVENC examples.

    Use Two-Pass Encoding = Tick it if you can!

    GPU = 0 unless you have multiple Graphics Cards.

    B-frames = As high as you can make it up to 16. Mine only goes up to 4.
     
    #9 Agamemnus, Feb 14, 2017

  10. Agamemnus
    10 – Buffer recording and AVIDemux.


    Replay Buffer:

    *** Currently not completely implemented. Will update when new version comes out. Currently only available in simple mode lossy. Very sad face.

    Buffer recording is where you don’t record EVERYTHING that happens, but rather just certain parts. Like a highlight reel without the editing. It’s where you have OBS running with a buffer, for this example it will be 30 seconds, and if something happens that you want to save, you can save the last 30 seconds to a file.

    The advantage of this is that you can play for 6 hours straight and only save 2 or 3 short blocks of video which you know are good plays. Lossless 1080p @ 30 FPS will run around 10MiB/sec so 6 hours would require over 200GB on your HDD. Then you have to watch the video and find the good bits and edit them out. This way, you know you have your good bits already prepared and lined up in neat little files. You can merge them together then encode it as one large highlight reel. What I do is I encode them FIRST, so that I have encoded copies that only take up 3GiB per hour instead of 36GiB per hour. Then I check the encoded copies to make sure they worked and look fantastic, and I delete the lossless originals and merge the new files into highlight reels. On the Unreal Aussies YouTube page there’s plenty of League of Legends and Dota 2 highlight reels along with some from Shootmania and The Hidden: Source.
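
    To put some rough numbers on that, a quick sketch using the ~10MiB/sec ballpark above:

        # Rough disk usage: recording a whole session lossless vs only
        # saving buffered highlight clips. Uses ~10MiB/sec for lossless
        # 1080p30 as ballparked above.
        LOSSLESS_MIB_PER_SEC = 10

        session_hours = 6
        full_gib = LOSSLESS_MIB_PER_SEC * 3600 * session_hours / 1024
        print(f"Recording everything: ~{full_gib:.0f} GiB")  # ~211 GiB

        clips, clip_seconds = 3, 30
        buffer_gib = LOSSLESS_MIB_PER_SEC * clips * clip_seconds / 1024
        print(f"{clips} buffered 30s clips: ~{buffer_gib:.2f} GiB")  # ~0.88 GiB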

    Open up OBS and make sure you've set up your Output-Recording tab to record lossless or near to it. See section 9 of this tutorial for more information. Then head to the settings Hotkeys section.
    • Start Replay Buffer -> This button you push to begin the buffer. It will buffer up to the number of seconds you have listed under “Replay Buffer length” in the Broadcast Settings screen.
    • Start Recording -> Push this button to start a video file which begins from when you pushed START Replay Buffer, or from how many seconds ago you have setup in your buffer length.
    • Stop Recording -> End the recording and save the file.

    Now, if you START Replay Buffer at the 1:07 mark and get to 22:51 and realise something amazing just happened, you can hit Start Recording. This will start a video file which begins at the 22:21 mark, so your amazing moment has been saved. Feel free at this point to make a witty comment or continue recording for a bit. When you’re ready, hit STOP Recording to finish the file. Now you can hit START Replay Buffer again to be prepared for your next amazing play!

    AVIDemux:

    Once you've got some footage saved up, it's time to edit. Most people like Adobe Premiere to mash video clips together and do some nice effects. I personally just take my 30 second blocks and merge them together, or if I record a tournament like the in-house Heroes of the Storm ones, I cut out the time in-between matches with AVIDemux. It simply creates one file by pasting all the clips you want together, and can do so with no encoding or converting. It just turns short files into a long one. There is no quality loss, but the end result is nowhere near as extravagant as something more professional.

    One thing to note is that AVIDemux will only merge files that are encoded with EXACTLY THE SAME PARAMETERS. This means they need to have been done with the same profile and level, audio format and bitrate, CQ or VBR mode. I have my settings in Handbrake saved as defaults for different situations. Also, the “uA Intro” clip that plays at the start, I have encoded in Handbrake multiple times to match each of these settings. There’s the old way I did it with Fraps recordings, another way for highlight reels and a final way for tournaments like in-house nights (because they are so long I encode with faster settings). I open AVIDemux and add the appropriate “uA Intro” then proceed to tack on all the rest.

    Get AVIDemux from here https://sourceforge.net/projects/avidemux/

    Open it up. The easiest way to do this is to have a folder of your videos open next to the AVIDemux window. When you drag one over and drop it onto the AVIDemux window, it will open the file in the program. Drag the first file over and wait for it to process, then drag the next one and it will add them together into one big movie. You can keep adding more and more clips this way.

    [​IMG]

    Over on the left you want the Video Output AND the Audio Output to say “Copy”. This is the feature of AVIDemux that convinced me to use it. It means that the bitstreams for video and audio will be copied bit for bit, like passthrough audio in Handbrake. Not many programs can do this, Premiere can’t for example, at least not the version I own. If you just wanted to merge your highlights into a reel then you’re good to go. Just save the video as something and it will only take a few seconds depending on how fast your hard disk is.

    If you want to cut bits out of the video then it's a little trickier. The way H264 and H265 both work is that they use an "I-Frame" as a picture, like a JPEG, which requires no other frames to work. Then they use "P-Frames", which are stored as the difference from a previous frame; they're not an image on their own. There's also "B-Frames", which are the real magic: they can use a previous frame AND a future frame and store the middle-ground/weighted-average, or however you want to think about it. P-Frames and B-Frames depend on other frames to give an image, so you can't START a video with one of those; you need to start it with an I-Frame. You heard a lot about Keyframe Interval in the streaming section of this guide? Well, I-Frames are the Keyframes.
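
    If you're curious, you can list every frame's type outside of AVIDemux too. Here's a rough sketch that shells out to ffprobe (it assumes you have ffmpeg/ffprobe installed and on your PATH, and "highlight.mkv" is just a placeholder filename):

        # List the frame types (I/P/B) of a video using ffprobe, so you
        # can see where the I-Frames fall. Assumes ffprobe is on your PATH.
        import subprocess

        def frame_types(path):
            out = subprocess.run(
                ["ffprobe", "-v", "error", "-select_streams", "v:0",
                 "-show_entries", "frame=pict_type",
                 "-of", "default=noprint_wrappers=1:nokey=1", path],
                capture_output=True, text=True, check=True,
            ).stdout
            return [line.strip() for line in out.splitlines() if line.strip()]

        types = frame_types("highlight.mkv")
        iframes = [i for i, t in enumerate(types) if t == "I"]
        # Frame number divided by the fps gives each I-Frame's timestamp.
        print(f"{len(types)} frames, first I-Frames at: {iframes[:10]}")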

    So you need the END of a cut to be on an I-Frame. Down the bottom of AVIDemux it tells you what type of frame you are looking at. In the image above it’s a “B-FRM” and you can see it just next to the timer. Move the slider to exactly where you want the cut to start and click the red box button with an “A” in it. This is the start of your selection. Then do the same for your END marker, move the slider but hit the button with the “B” in it right next to the last one. This point needs to be an I-Frame, if not, then the video/audio “copy mode” we want can create errors. To end a cut at a point other than an I-Frame means the whole video will need to be encoded lossy again, losing quality. This may be what turns you off AVIDemux because if you’re going to do a whole recode, then why not use Premiere with its larger number of features? Well for me, I manage just fine with AVIDemux and setting the END point for a cut to an I-Frame usually only means a couple of seconds early. For my purposes this suits and I recommend it.

    To find an I-Frame just use the blue circle buttons with the double arrows in them, like the fast-forward or rewind buttons on a remote control. These will skip to the next I-Frame in that direction. Remember, you can START your cut on most frames, but it needs to END on an I-Frame. If you’ve messed it up, AVIDemux will tell you about it when you try to save the file. When you’ve got your selection, just hit the delete button on your keyboard and it will be removed. If you save while there’s a selection, it will save JUST the selection. Once you’ve deleted it, the selection disappears and you can save the result.
     
    #10 Agamemnus, Feb 14, 2017

  11. Agamemnus
    Update for AV1. This guy (I think from Mozilla) was in Hobart for linux.conf.au just this January and gave a nice summary speech about AV1 and how it came about.

    FYI the stock release version was released in the first week of April as version 0.1.0 and will undergo adaptation until the end of the year. There is an encoder available but no easy decoder that I can see, so you can't play the video you make anyway LOL

     
  12. ibbanez
    I wanted to say that I thought this was a great write up and probably one of the most informative posts I've read on the subject. Great work and I love the site. Now, I do have a question. I record everything in 4k60 and have started uploading to YouTube. But since this is in the US, almost all ISPs impose a bandwidth cap. I have 2TB, but I have 3 girls and we cut the cord, so they watch Netflix, Hulu, YouTube, etc all day. I started a channel for me and my daughter to make game plays and other things, and I'm uploading full game playthroughs, so it's hours and hours of gameplay. Previously, I was re-encoding the 4k footage down to 53-68Mbps variable bitrate, but with how long the videos are and how frequent they are, the file size is just too much to upload everything in 4k to YouTube. Do you have any thoughts on what I could do to reduce the file size, drastically? Since it's YouTube, I'm trying to find a compromise between file size and quality, vs me re-encoding down to 1080 @ 16500kbps on Medium with x264 and a few other settings. Thanks.
     
  13. Agamemnus
    Thanks!

    Yeah man there's a few things you can try:
    1. First of all, your overall strategy might be the thing you can improve most. If you're encoding from the game straight into an encoded file then uploading that file to YouTube, then that 2-step process is holding you back. It means you need to encode live, so the amount of compression you can put into the encoding is minimal, since your computer is trying to keep up with the live gameplay. Instead, you should use OBS (or fraps if you want, but the OBS settings are in the guide above) and encode lossless. If you can't encode lossless, then try the NVENC settings shown at a massive bitrate. Once this lossless file is on your hard drive, use Handbrake to encode QRF 20 or 21. See the Handbrake section for some tips, but generally speaking, the longer it takes your PC to encode, the smaller the file will be at a given QRF. Compare this to one of your existing videos, it might look better, in which case you can increase the QRF number a bit and get an even smaller file size. Once you've done this and you have some settings you're happy with, delete the original lossless recording and upload your new super-compressed version to YouTube.
    2. Another thing I recommend is to encode at a Constant Quality RF instead of by bitrate. From your post it sounds like you're using bitrates. Constant Quality RF will use however many bits it takes to get the quality you determine, whereas using bitrate, you sometimes spend more bits than you need to, and sometimes not enough, so the video still looks poor at some points. With set bitrate, you're wasting some bits and still see things that make you think you need to turn up the quality. QRF of 20 is great quality, but you might be able to drop it to 21 or 22 (higher number is worse quality) and it might still look fine. Get a short clip and compare those 3 with what you've got and see if the total file size is different.
    3. 4K @ 60fps should look absolutely fantastic with a QRF of 21 and the total filesize should work out to be around 50Mbit/sec. If you are happy with a QRF of 22 or 23 then your final bitrate should drop down to 40 or so, which is a 20% saving. If not, then you may want to look at some changes to how you record. YouTube will accept 50 fps recordings, which will save you over 10% bitrate (not quite 1/6, more like 1/8) in the end product but won't look any different. Some people say that 60 fps looks better than 50, but honestly, for viewing, the difference is really so tiny I wouldn't worry about it. Try it out and see if it makes a difference for you. If you're recording shooters like Overwatch or Counter Strike then 50 is as low as you can go before people will start noticing. However if you're recording Heroes of the Storm, League of Legends, Minecraft, Warcraft or anything else that isn't as fast paced as Overwatch, then you can easily drop your frame rate to 30. If you do QRF 22 and a framerate of 50 AND use the 3 step process of OBS Lossless -> Handbrake -> YouTube you should see around 36-37 Mbit/sec from Very Slow encoding, but mind you, your PC will have to encode night and day to achieve this!
    4. Finally, there's one more tiny little buff you can get to the bitrate. I don't know what audio codec and bitrate you're using, but the lowest possible setting you can use while still sounding transparent to over 90% of people is the Opus codec at 56Kbit/sec per channel. So for stereo, 112Kbit/sec. If you're using AAC at 256 then you're wasting bandwidth. YouTube will happily accept Opus audio; in fact, they already convert their most popular videos to this (even if they were uploaded in AAC) just to save bandwidth. It may seem like it's not worth it to reduce your bandwidth by just a couple hundred Kbit/sec, but if you're uploading hours and hours every day, which it sounds like you are, then let's look at this realistically. If your current audio setting is 256 AAC and you change it to 112 Opus, then you're saving around 60MiB per hour of footage. It could be worth it! In terms of maintaining the perfect quality, you need to either record your original gameplay to your HDD in Opus straight from the game (which I still haven't quite got right in OBS) or you record lossless audio like PCM in Fraps or a SUPER high bitrate AAC in OBS and then convert to Opus in Handbrake when you convert your video. (If you want to script this whole workflow, see the sketch after this list.)
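
    If you want to batch this instead of clicking through the Handbrake GUI each time, the same settings work in HandBrakeCLI. A rough sketch; the filenames are placeholders, and you should double-check the flags against HandBrakeCLI --help for your version:

        # Encode a lossless recording down to Constant Quality RF 21 x264
        # with Opus 112kbps audio using HandBrakeCLI. Filenames are
        # placeholders; verify flags with `HandBrakeCLI --help`.
        import subprocess

        subprocess.run([
            "HandBrakeCLI",
            "-i", "lossless_recording.mkv",  # OBS/Fraps lossless capture
            "-o", "youtube_upload.mkv",
            "-e", "x264",                    # video encoder
            "-q", "21",                      # Constant Quality RF
            "--encoder-preset", "slow",      # slower = smaller file at same RF
            "-E", "opus",                    # audio encoder
            "-B", "112",                     # audio bitrate in kbps
        ], check=True)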

    In summary, the main things to get the best possible bitrate for the quality you want is to record lossless first, then Handbrake encode at a QRF that you are happy with, and use a preset (preferably slower than Medium) that is as slow as you can bear. Tweaking your audio and framerate are just little bits extra you can squeeze.
     
    #13 Agamemnus, Jun 30, 2017

  14. ibbanez
    Hello and thanks for the response. I think I didn't communicate clearly in my first post. I currently am recording the 4k60fps footage in lossless and then re-encoding down to x264 medium @ 53Mbit. I just didn't know if there was some super secret way to heavily compress everything and shrink the file size to something like 30Mbit. I didn't think about the 50 fps and that may be something to check out. Also, I did read the section on Opus, but have yet to implement it. Currently I'm recording AAC 320kbps ;p Honestly, I love the look of my 4k footage at 63Mbit, but for full-on gameplay, I'd never have enough bandwidth each month cuz our providers suck in the States :( Most of what you suggested I've been doing with OBS already. I really hope AV1 turns out nice and YouTube accepts it and the file sizes shrink like HEVC. Thanks and I love the site.
     
  15. Agamemnus
    OK yeah that's a bit clearer. It's good that you're doing it to your HDD first then encoding, that's the main thing that gets bitrate down. You're still left with 4 more things you can do:
    1. Don't encode with a set bitrate, constant/variable/average they are all worse than Constant Quality. Switch to Constant Quality RF and give it a try. Imagine a video that is 2 seconds long, the first second requires only 20Mbit to look fantastic, but the next second requires 70Mbit to look fantastic. If you encode at 53Mbit you're getting a fantastic first half, but the next half doesn't quite look fantastic. With Constant Quality set to fantastic, your first second will only use 20Mbit and the second second will use 70Mbit. It will average out to 45Mbit/sec over the whole clip and the whole thing looked great, whereas setting 53Mbit/sec will always try to average out to 53Mbit/sec and parts of the video still don't look good. It's just wasting bits. The hardest part with using Constant Quality is finding the RF number that you want. There is no formula to make the Constant Quality equal to a set bitrate, you have to do it experimentally. Try make a 30 second recording in lossless, then convert it using Constant Quality RF 20, then again at 21, then again at 22 and look at the results. See if they are different, see if the file size is smaller or larger. Then maybe try it with some new RF numbers. You will find the sweet spot. I'll test out one of my own and see what I come up with.
    2. Try it at 50 fps. Because higher frame rates mean more frames are very similar to the ones around them, they compress quite well. Cutting 1/6th of your fps won't save you 1/6th of your bitrate, but you might save 1/8th or 1/10th of your bitrate. Again, try the same source clip at both framerates and see if you can tell the difference, and check the file sizes.
    3. Use a slower preset. Because you've been doing this at set bitrates, the presets haven't changed the file size for you, they just change the quality. When you've set Constant Quality with an RF number it won't change quality with a slower preset, instead it will make the FILE SIZE smaller!!! Try your test clip at Medium preset with RF 22, then try it again with Slow preset at RF22, and again at Slower preset RF22. You'll see that they look identical, but the file size shrinks because your CPU has put more time and effort into the compression.
    4. You gotta change that audio man! If you are recording stereo then you can get transparent sound in AAC at just 128Kbit/sec. If it's 4 channel surround then 256Kbit/sec will do it; if it's 6.1 then 320 is probably where it's at. Are you recording in 6.1? If so, then when you re-encode in Handbrake you can change the audio down to 160Kbit/sec Opus with the "mixdown" set to "Dolby Pro Logic II". This records in 2 channels but with a multiplexed phase-shifted addition that surround sound players will understand and use to produce surround sound. It sounds perfect on stereo headsets or speakers, and it sounds NEARLY as good on a fully expensive epic surround sound lounge room setup. Think about your target audience, and think about the game you're recording and how much it really uses the surround sound; is it worth going any higher than that? I know it's only saving you 1/4 of a Mbit/sec, but every bit counts.

    Trust me, make those 4 changes and you'll cut your actual file-size by as much as 20% and the quality will still be great. Like I said, I'll test some 4K videos tonight, but it will take me at least a day to get them processed and compare results. Check back here tomorrow.

    P.S. YouTube definitely WILL accept AV1, they were the lead pushers for the collaboration. They were developing VP10 to replace HEVC and realised that other companies were having good progress with Daala and Thor. They all joined forces to work on AV1 and Google has specifically stated that once the encoders are finalised, they will begin converting their ENTIRE YOUTUBE LIBRARY to AV1 and they will start within 3 months of release and their first targets will be the highest quality popular videos because they use the most of their bandwidth. Slowly but surely, the whole of YouTube and Netflix will all be changed to AV1. It's happening and it's all totally free. The drawback for us though, is that it will require considerably more processing power. If it's taking you an hour to encode an x264 video now, it will take you between 3-10 hours with AV1 and you might save 20%-50% of the bandwidth for the same quality depending on your source. Once it comes out, I will begin saving for a new PC LOL
     
    #15 Agamemnus, Jul 2, 2017
    Last edited: Jul 2, 2017
  16. ibbanez

    ibbanez New Member

    Joined:
    Jun 24, 2017
    Unreal Credits:
    102
    Luckily I just built a new system on the 8 core Ryzen R7. Once Threadripper comes out in a few months, I'm hoping the hit will be much more manageable. Thanks for the info.
     
  17. Agamemnus

    Agamemnus Administrator
    Unreal Officer

    Joined:
    Feb 17, 2016
    Unreal Credits:
    12,528
    OK, so I did a bunch of encodes of the same video. I'm going to add these to the guide at some point because there are no 4K tests in there. All the videos looked good up to RF 24, at which point I could see some very slight colour blending/blocking, so I didn't test any higher RF values.

    Resolution = 3840 x 2160
    Framerate = 60

    Approximate encoding FPS for each RF number and x264 preset (it gets a little faster as the RF gets higher):
    RF 20 Medium = 8.4
    RF 20 Slow = 4.8
    RF 20 Slower = 3.7
    RF 20 Very Slow = 2.9
    RF 21 Medium = 8.5
    RF 21 Slow = 5.0
    RF 21 Slower = 4.0
    RF 21 Very Slow = 2.9

    Bitrate in Mbit/sec for the final video:
    RF 20 Medium = 42.6
    RF 20 Slow = 42.6
    RF 20 Slower = 41.4
    RF 20 Very Slow = 38.4
    RF 21 Medium = 38.6
    RF 21 Slow = 38.5
    RF 21 Slower = 37.6
    RF 21 Very Slow = 34.7
    RF 22 Medium = 34.8
    RF 22 Slow = 34.8
    RF 22 Slower = 33.8
    RF 22 Very Slow = 31.2
    RF 23 Medium = 31.1
    RF 23 Slow = 31.2
    RF 23 Slower = 30.5
    RF 23 Very Slow = 27.8
    RF 24 Medium = 27.6
    RF 24 Slow = 27.7
    RF 24 Slower = 27.0
    RF 24 Very Slow = 24.3

    Encoder settings for RF 24 Very Slow:

    x264 core 148 r2708 86b7198
    Encoding settings : cabac=1 / ref=5 / deblock=1:0:0 / analyse=0x3:0x133 / me=umh / subme=10 / psy=1 / psy_rd=1.00:0.00 / mixed_ref=1 / me_range=24 / chroma_me=1 / trellis=2 / 8x8dct=1 / cqm=0 / deadzone=21,11 / fast_pskip=1 / chroma_qp_offset=-2 / threads=18 / lookahead_threads=3 / sliced_threads=0 / nr=0 / decimate=1 / interlaced=0 / bluray_compat=0 / constrained_intra=0 / bframes=8 / b_pyramid=2 / b_adapt=2 / b_bias=0 / direct=3 / weightb=1 / open_gop=0 / weightp=2 / keyint=240 / keyint_min=24 / scenecut=40 / intra_refresh=0 / rc_lookahead=60 / rc=crf / mbtree=1 / crf=24.0 / qcomp=0.60 / qpmin=0 / qpmax=69 / qpstep=4 / vbv_maxrate=300000 / vbv_bufsize=300000 / crf_max=0.0 / nal_hrd=none / filler=0 / ip_ratio=1.40 / aq=1:1.00


    So you can get down to some pretty low bitrates there, mate. I recommend trying RF 22 with the Very Slow preset and giving your Ryzen a real workout. If it's taking too long to encode and you can't swing that, try RF 23 with the Slower preset.
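
    In HandBrakeCLI terms those two suggestions look something like this (a sketch only - the filenames are hypothetical):

        # RF 22 with the Very Slow preset: the recommended starting point
        HandBrakeCLI -i gameplay.mkv -o gameplay_rf22.mkv --encoder x264 -q 22 --encoder-preset veryslow

        # Fallback if Very Slow takes too long: RF 23 with the Slower preset
        HandBrakeCLI -i gameplay.mkv -o gameplay_rf23.mkv --encoder x264 -q 23 --encoder-preset slower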
     
  18. ibbanez

    ibbanez New Member

    Joined:
    Jun 24, 2017
    Unreal Credits:
    102
    Thank you so much. If I can have the video looking great at those lower bitrates, that would be amazing. My current playthrough isn't a super fast action game, so I have some other ideas of things to try. I'll let you know how these turn out. Thanks again...
     
  19. Agamemnus

    Agamemnus Administrator
    Unreal Officer

    Joined:
    Feb 17, 2016
    Unreal Credits:
    12,528
    YouTube now accepts H.265 uploads, but it converts them to H.264 for viewers. If a video gets lots of views, YouTube will then also add a VP9 option. Below is an example of what was originally uploaded in H.265:

    [embedded example video]

    Main posts have been updated to reflect this change.

     
  20. Agamemnus

    Agamemnus Administrator
    Unreal Officer

    Joined:
    Feb 17, 2016
    Unreal Credits:
    12,528
    Good news, I have 2 main updates pending:
    1. A test of AV1 that I did for an assignment at uni. It's full on, but I'll get it linked here at some point.
    2. Opus in OBS. I took the time to figure out how to use Opus in OBS. The thing that got me previously was the 44.1KHz sample rate: Opus only works at sample rates that divide evenly into 48KHz, so set OBS's audio sample rate to 48KHz and it will work (see the sketch below). I'll have the guide updated before PAX.
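
    The same 48KHz rule applies outside OBS too. Here's a quick ffmpeg sketch (hypothetical filenames) that re-encodes a recording's audio to 160Kbit/sec Opus while leaving the video untouched:

        # Copy the video stream as-is; re-encode only the audio to Opus.
        # libopus won't take 44.1KHz input, so -ar 48000 forces the resample.
        ffmpeg -i recording.mkv -c:v copy -c:a libopus -b:a 160k -ar 48000 recording_opus.mkv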
     
