FFmpeg fails to convert WebM files when h264_nvenc is forced

I have been working with the environment below:



==> Ubuntu 16.04.3



==> FFmpeg 3.4.2



-- configuration: --prefix=/usr/local/ffmpeg_new/ --enable-cuda --enable-cuvid --enable-nvenc --enable-nonfree --enable-libnpp --extra-cflags=-I/usr/local/cuda/include --extra-ldflags=-L/usr/local/cuda/lib64 --enable-libvpx --enable-libvorbis --enable-libfreetype



==> CUDA 9.1.85



==> GPU: GeForce GTX 1080 Ti



My goal is to accelerate these video operations on the GPU, since there is a high rate of traffic on my media server.



Here are the main steps of my process:



1. Split out the audio from the caller's video.



time ffmpeg -y -i 230087_caller.webm -vn -ab 256 230087_caller.wav



2. Split out the audio from the callee's video.



time ffmpeg -y -i 230087_callee.webm -vn -ab 256 230087_callee.wav



3. Mux the original caller video with the callee audio to store the conversation.



time ffmpeg -y -i 230087_caller.webm -i 230087_callee.wav -filter_complex '[0:a]aformat=sample_fmts=fltp:sample_rates=44100:channel_layouts=stereo,volume=0.5[a1]; [1:a]aformat=sample_fmts=fltp:sample_rates=44100:channel_layouts=stereo,volume=0.5[a2]; [a1][a2]amerge,pan=stereo|c0


4. Draw a timestamp on the caller_temp file (drawtext with a font file), which fails with the error below.



time ffmpeg -y -i caller_temp.webm -vf drawtext=fontfile=/usr/share/fonts/truetype/dejavu/DejaVuSans-Bold.ttf:x=8:y=8:fontsize=16:fontcolor=yellow@1:expansion=strftime:basetime=1518172680000000:text='%Y-%m-%d %H-%M-%S' -strict -2 -shortest -c:a libvorbis -c:v h264_nvenc final_font_test.webm



[webm @ 0x29e8540] Only VP8 or VP9 video and Vorbis or Opus audio and WebVTT subtitles are supported for WebM.
av_interleaved_write_frame(): Invalid argument
Error writing trailer of output_temwp.webm: Invalid argument



==================================



Here are the details of my WebM file:



Stream #0:0(eng): Video: vp8, yuv420p, 640x480, SAR 1:1 DAR 4:3, 27.58 tbr, 1k tbn, 1k tbc (default)
Metadata:
title : Video
Stream #0:1(eng): Audio: opus, 48000 Hz, stereo, fltp (default)
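
Such a stream listing can be reproduced by probing the file; a hypothetical invocation (the filename here is just one of the inputs above):

ffprobe -hide_banner 230087_caller.webm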



==================================



I have taken a look at the NVIDIA video encode/decode support matrix, and my GPU seems like it should support this conversion.



https://developer.nvidia.com/video-encode-decode-gpu-support-matrix



=================================



However, when I change the output container to MP4/AVI/MPEG, the GPU encoder works without any issue. Still, WebM is important for us, because the file sizes with the other containers become risky from a storage perspective.
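
For reference, a variant of step 4 along these lines goes through h264_nvenc without complaint. This is only a sketch of the working case, not the exact command I ran; in particular the AAC audio codec and the .mp4 output name are assumptions here:

time ffmpeg -y -i caller_temp.webm -vf drawtext=fontfile=/usr/share/fonts/truetype/dejavu/DejaVuSans-Bold.ttf:x=8:y=8:fontsize=16:fontcolor=yellow@1:expansion=strftime:basetime=1518172680000000:text='%Y-%m-%d %H-%M-%S' -strict -2 -shortest -c:a aac -c:v h264_nvenc final_font_test.mp4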



I would appreciate any comments or feedback on how to proceed with this issue.

Thanks in advance.
Regards

  • You're using an H.264 encoder, which the WebM container doesn't support. You need a VP8/VP9 encoder that makes use of NVIDIA hardware, but I don't see one available.

    – Gyan
    Feb 28 '18 at 6:09

  • Thank you Mulvya for your feedback. Do you have any recommendation on hardware that supports VP8/VP9 encoding instead of the NVIDIA GeForce? Or can you recommend how to get the best output performance from this existing GPU, i.e. with a different type of output container?

    – oktay eşgül
    Feb 28 '18 at 7:18

  • You can save to MKV.

    – Gyan
    Feb 28 '18 at 7:27

  • Any comment on GPU hardware that supports VP8/VP9?

    – oktay eşgül
    Feb 28 '18 at 7:30

  • On Linux, you can use VAAPI to use Intel GPUs for VP8/VP9.

    – Gyan
    Feb 28 '18 at 7:31





1 Answer



According to the Support Matrix, NVENC only supports H.264 (AVC) and H.265 (HEVC).

You can use NVDEC to decode VP8/VP9, given a compatible GPU.

For hardware-supported encoding of VP8/VP9, please check VAAPI encoder support in FFmpeg.

To use the default (software) decoder for the input, then upload the frames to VAAPI and encode with VP9 at default settings:



ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 -vf 'format=nv12,hwupload' -c:v vp9_vaapi output.webm
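
Adapting this to step 4 of the question mostly means keeping drawtext as a software filter and uploading the frames just before the VAAPI encoder. An untested sketch (it assumes a VAAPI-capable GPU at /dev/dri/renderD128, an FFmpeg build with VAAPI support, and keeps libvorbis for the audio; the output name is arbitrary):

ffmpeg -y -vaapi_device /dev/dri/renderD128 -i caller_temp.webm -vf "drawtext=fontfile=/usr/share/fonts/truetype/dejavu/DejaVuSans-Bold.ttf:x=8:y=8:fontsize=16:fontcolor=yellow@1:expansion=strftime:basetime=1518172680000000:text='%Y-%m-%d %H-%M-%S',format=nv12,hwupload" -c:v vp9_vaapi -c:a libvorbis final_font_test_vaapi.webm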





  • The Mesa VAAPI driver uses the UVD (Unified Video Decoder) and VCE (Video Coding Engine) hardware found in all recent AMD graphics cards and APUs.

    – der_michael
    Jan 8 at 5:26













  • Quicksync supports VP8 encoding since Braswell and VP9 since Apollo Lake, see: trac.ffmpeg.org/wiki/Hardware/QuickSync

    – der_michael
    Jan 8 at 5:28
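
Building on the Quick Sync comment above, a hypothetical vp9_qsv invocation; this is an untested sketch and assumes an FFmpeg build with --enable-libmfx plus an Intel GPU and FFmpeg release recent enough to expose the vp9_qsv encoder:

ffmpeg -init_hw_device qsv=hw -filter_hw_device hw -i input.webm -vf 'hwupload=extra_hw_frames=64,format=qsv' -c:v vp9_qsv -c:a libvorbis output.webm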












