Why are videos rendered by the CPU instead of the GPU?











8 upvotes, 3 favorites












Well, I know this might sound like a pretty stupid question, but I couldn't find an answer using Google, so here goes...

So, I know there are technologies like OpenCL and CUDA, but why is the CPU used by default to render, for example, a video file out of video editing software? It seems counterintuitive to me that the graphics processing unit is not used to process, well, graphics. When I'm playing a video game, the GPU is in charge of producing the image on my screen as well, isn't it?



Again, I know this may sound stupid to you. Please be gentle °A°



Edit: I was talking specifically about rendering the video output of NLE software like Premiere Pro.






























  • 3




    Most online video is compressed, so the real work is decompressing the file, which is a general operation handled by the CPU. GPUs would probably be very good at handling compression/decompression if the libraries supported it, but generally they get used for specific calls to draw items on screen, not for general computation. Games take advantage because they're fully dynamic, so they really leverage the GPU; video isn't generated on the fly, it's predefined (this color pixel at this coordinate), so GPUs aren't as useful for that.
    – ernie
    Jul 30 '14 at 21:40












  • Are you talking specifically about video editing software rendering a video file? If not, can you please offer some other examples? How much GPU a video editor uses while rendering depends immensely on the rendering software package being used. As-is this is too broad (IMO).
    – Ƭᴇcʜιᴇ007
    Jul 30 '14 at 22:48












  • Yes, I was talking about that. Sorry if that was unclear.
    – MoritzLost
    Jul 30 '14 at 22:50










  • Then please edit your question to be about the specific problem you're facing; include the software you've tried, what OS, and what you've attempted to correct it. Then ask a specific question about that specific problem. Again, as-is this is just too broad, as many video rendering packages can use the GPU if they're configured properly. Then we may be able to help you figure out why yours isn't. :)
    – Ƭᴇcʜιᴇ007
    Jul 30 '14 at 22:53

















graphics-card cpu gpu rendering














edited Jul 30 '14 at 22:55

























asked Jul 30 '14 at 21:34









MoritzLost











1 Answer























13 upvotes, accepted










Before HD was a thing, CPUs could handle video decoding easily. When HD became popular about 8 years ago, GPU manufacturers started implementing accelerated video decoding in their chips. You could easily find graphics cards marketed as supporting HD video, among other slogans. Today practically every GPU supports accelerated video decoding, even integrated GPUs like Intel HD Graphics and their predecessors, Intel GMA. Without it, your CPU would have a hard time digesting 1080p video at an acceptable framerate, not to mention the increased energy consumption. So you're already using accelerated video every day.
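For illustration, here is a minimal sketch, assuming an ffmpeg build with at least one hardware acceleration backend; the clip name and the "cuda" backend are placeholders for whatever your system actually provides:

    # Sketch: ask ffmpeg which hardware decoders it exposes, then time a decode
    # with and without one. "sample_1080p.mp4" and "cuda" are placeholders.
    import subprocess
    import time

    def available_hwaccels():
        """List the hardware acceleration methods this ffmpeg build reports."""
        out = subprocess.run(["ffmpeg", "-hide_banner", "-hwaccels"],
                             capture_output=True, text=True, check=True).stdout
        # First line is the "Hardware acceleration methods:" header.
        return [line.strip() for line in out.splitlines()[1:] if line.strip()]

    def timed_decode(path, hwaccel=None):
        """Decode a file, discard the frames, return wall-clock seconds."""
        cmd = ["ffmpeg", "-hide_banner", "-loglevel", "error"]
        if hwaccel:
            cmd += ["-hwaccel", hwaccel]   # e.g. "cuda", "vaapi", "videotoolbox"
        cmd += ["-i", path, "-f", "null", "-"]
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        return time.perf_counter() - start

    if __name__ == "__main__":
        print("hwaccels:", available_hwaccels())
        print("CPU-only decode:", timed_decode("sample_1080p.mp4"))
        print("GPU-assisted decode:", timed_decode("sample_1080p.mp4", "cuda"))

On a machine with a supported GPU, the hardware-assisted run typically finishes faster and with noticeably lower CPU usage.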



Now that GPUs have more and more general-purpose computational power, they are widely used to accelerate video processing too. This trend started around the same time accelerated decoding was introduced. Programs like Badaboom gained popularity as it turned out that GPUs are much better at (re)encoding video than CPUs. It couldn't be done earlier, though, because GPUs lacked general-purpose computational abilities.
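As a rough illustration of that speed difference, here is a sketch that re-encodes the same clip once with the libx264 CPU encoder and once with NVIDIA's NVENC hardware encoder, assuming an ffmpeg build that includes both and an NVENC-capable GPU; the file names are placeholders:

    # Sketch: re-encode the same clip with a CPU encoder and a GPU encoder and
    # compare wall-clock time. "clip.mp4" is a placeholder input file.
    import subprocess
    import time

    def encode(src, dst, codec):
        """Re-encode src to dst with the given video encoder, dropping audio."""
        cmd = ["ffmpeg", "-y", "-hide_banner", "-loglevel", "error",
               "-i", src, "-c:v", codec, "-b:v", "5M", "-an", dst]
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        return time.perf_counter() - start

    if __name__ == "__main__":
        print("libx264 (CPU):   ", encode("clip.mp4", "out_cpu.mp4", "libx264"))
        print("h264_nvenc (GPU):", encode("clip.mp4", "out_gpu.mp4", "h264_nvenc"))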



But GPUs have been able to scale, rotate and transform pictures since the Middle Ages, so why couldn't we use those features for video processing? Well, those features were never implemented to be used that way, so they were suboptimal for various reasons.



When you program a game, you first upload all the graphics, effects and so on to the GPU, and then you just render polygons and map the appropriate textures onto them. You don't have to send textures every time they are needed; you can load them once and reuse them. When it comes to video processing, you have to constantly feed frames to the GPU, process them, then fetch them back to re-encode them on the CPU (remember, we're talking about pre-GPGPU times). This isn't how GPUs were meant to work, so performance wasn't great.
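That per-frame round trip looks roughly like the following sketch, assuming pyopencl, numpy and a working OpenCL driver. The kernel itself is trivial; the point is steps 1 and 3, the host-to-device and device-to-host copies repeated for every single frame:

    # Illustrative sketch of the per-frame round trip described above.
    # Assumes numpy and pyopencl are installed and an OpenCL device is available.
    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()
    queue = cl.CommandQueue(ctx)
    mf = cl.mem_flags

    # Trivial per-pixel kernel: brighten each 8-bit sample, clamped to 255.
    program = cl.Program(ctx, """
    __kernel void brighten(__global const uchar *src, __global uchar *dst) {
        int i = get_global_id(0);
        int v = src[i] + 16;
        dst[i] = v > 255 ? 255 : (uchar)v;
    }
    """).build()

    width, height = 1920, 1080
    frame = np.random.randint(0, 256, size=width * height * 3, dtype=np.uint8)
    result = np.empty_like(frame)
    dst_buf = cl.Buffer(ctx, mf.WRITE_ONLY, frame.nbytes)

    for frame_index in range(100):        # pretend these are decoded video frames
        # 1. Upload the frame to GPU memory (host -> device), every single frame.
        src_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=frame)
        # 2. Run the per-pixel kernel on the device.
        program.brighten(queue, (frame.size,), None, src_buf, dst_buf)
        # 3. Download the processed frame back (device -> host) so the CPU
        #    encoder can consume it. This constant shuttling is the overhead
        #    described above; a game uploads its textures once and reuses them.
        cl.enqueue_copy(queue, result, dst_buf)
    queue.finish()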



Another thing is that GPUs aren't quality-oriented when it comes to image transformations. When you're playing a game at 40+ fps, you won't really notice slight pixel inaccuracies, and even if you did, game graphics weren't detailed enough for people to care. There are various hacks and tricks used to speed up rendering that can slightly affect quality. Videos are played at fairly high framerates too, so scaling them dynamically at playback time is acceptable, but re-encoding or rendering has to produce results that are pixel-perfect, or at least as close to that as possible at reasonable cost. You can't achieve that without proper features implemented directly in the GPU.



Nowadays using GPUs to process video is quite common because the required technology is in place. Why it's not the default choice is really a question for the program's publisher, not for us; it's their decision. Maybe they believe their customers' hardware is better suited to processing video on the CPU, so switching to the GPU would hurt performance, but that's just my guess. Another possibility is that they still treat GPU rendering as an experimental feature that isn't stable enough to make the default yet. You don't want to waste hours rendering your video only to find that something is broken because of a GPU rendering bug. If you decide to enable it anyway, you can't blame the software publisher; it was your decision.



























answered Jul 30 '14 at 22:35

gronostaj





























