vivan
Expert User
Posts: 193
Joined: Sat Dec 19, 2009 10:56 am
PC Specification: Acer Aspire TimelineX 3830TG
Location: St. Petersburg, Russia

Re: Best settings & decoder

Mon Sep 26, 2011 5:18 pm

P.J wrote:Each video card has different video quality:
This article doesn't make any sense.
They are all equal. Video decoding can be correct or broken; it can't be better or worse. Video rendering is equal and depends on software (e.g. madVR). Deinterlacing is different, but software deinterlacing obviously has better quality anyway... and who uses interlaced video these days?
The only thing left is video postprocessing, but it's useless on HQ sources.

Btw, the thing I hate most about my desktop ATI card: with every driver update it switches on ugly postprocessing that can lead to eye cancer.

So... if you are using proper software, you get the best quality on any card that can handle it (even my weak integrated Intel graphics can handle 1080p@24fps with madVR, though at 60 fps I have to switch to the discrete card).
Desktop (Intel i7-970, ATI 5870, Windows 7 x64)
Acer Aspire TimelineX 3830TG (Intel i5-2410M, nVidia GT540M, Windows 7 x64)
PS Vita, Nokia N8

P.J
Posts: 300
Joined: Mon Jun 27, 2011 7:25 pm
PC Specification: i5-4460, 16GB RAM, GTX 960, Win10 x64

Re: Best settings & decoder

Mon Sep 26, 2011 8:51 pm

vivan wrote:
P.J wrote:Each video card has different video quality:
This article doesn't make any sense.
They are all equal. Video decoding can be correct or broken; it can't be better or worse. Video rendering is equal and depends on software (e.g. madVR). Deinterlacing is different, but software deinterlacing obviously has better quality anyway... and who uses interlaced video these days?
The only thing left is video postprocessing, but it's useless on HQ sources.

Btw, the thing I hate most about my desktop ATI card: with every driver update it switches on ugly postprocessing that can lead to eye cancer.

So... if you are using proper software, you get the best quality on any card that can handle it (even my weak integrated Intel graphics can handle 1080p@24fps with madVR, though at 60 fps I have to switch to the discrete card).
You forgot many things ;)
Most videos are interlaced, and most computers don't have a powerful CPU.
When you have to use the GPU, you can't run deinterlacing on the CPU.
Software decoding isn't the right way; it's only good for some cases like 1080p60 or 2K/4K,
because most GPUs can't handle those. The benefit of Splash is its hardware decoding.
Just look at the CPU usage while decoding 1080p, then compare it with a Samsung Galaxy S II.
Isn't it crazy to burn CPU power on decoding it?
Did you forget CrystalHD, or your previous netbook, the HP Mini 311?
And madVR is just a codec. Using it with some programs like MPC-HC is painful.

Knight
Posts: 12
Joined: Tue May 24, 2011 7:11 pm
PC Specification: Don't remember... :-D

Re: Best settings & decoder

Mon Sep 26, 2011 9:37 pm

Honestly, these days I'm facing a dilemma: use the Catalyst "effects" or turn them off? Use them, or use the Light and Detail Boost of Splash?

With many effects enabled the picture looks better, but sometimes... too fake.

Use edge enhancement? The value suggested on some forums is 35, which to me is too much. And isn't it like the Detail Boost of Splash?

Use dynamic contrast? Nice, but doesn't it pump things up too much? And isn't it like Light Boost in Splash?

And I haven't even gotten to skin tone correction, color vibrance, dynamic range, Enforce Smooth Playback...

I'm really lost. I've spent days and days trying, but I don't know what to keep and what to drop.

The only thing I've understood is that there are two schools of thought: one that suggests enabling all of them, and one that suggests using none of them. No middle way. And on top of all this there are also the Splash settings... :roll:

vivan
Expert User
Posts: 193
Joined: Sat Dec 19, 2009 10:56 am
PC Specification: Acer Aspire TimelineX 3830TG
Location: St. Petersburg, Russia

Re: Best settings & decoder

Tue Sep 27, 2011 6:09 pm

P.J wrote:Most of the videos are interlaced
I only get interlaced video from my camera.
All rips are progressive, and so are most BDs (except for "LSD-driven Japanese companies" that use hard telecine = 30i video that is really 24p). So... where would all that interlaced video come from? Recordings of some TV channels (though the shows themselves are still progressive)? I can't count that as "most of the video".
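The "hard telecine" mentioned above is the standard 2:3 pulldown trick: 24p film frames are held for alternating 2 and 3 fields so the stream plays back as 30i, even though only 24 distinct pictures per second exist. A minimal Python sketch (function name and frame representation are my own illustration, not anything from madVR or Splash):

```python
def pulldown_2_3(frames):
    """2:3 pulldown: spread each group of 4 progressive frames over 10 fields.

    Frames are held for 2, 3, 2 and 3 fields in turn, so 24 frames/s
    becomes 60 fields/s (i.e. 30 interlaced frames per second).
    """
    pattern = [2, 3, 2, 3]
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * pattern[i % 4])
    return fields

fields = pulldown_2_3(list(range(24)))  # one second of 24p film
print(len(fields))                      # 60 fields -> 30i
```

Inverse telecine (what a player does to recover the original 24p) just detects this repeating field pattern and drops the duplicates.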
P.J wrote:and the most of computers don't have powerful CPU.
...but they have powerful GPUs, as described in your link? :D
P.J wrote:When you have to use GPU, you can't run de-interlacing on CPU.
Nope. With Intel Quick Sync you can do any postprocessing you want on the CPU while the video is decoded on the GPU. With nVidia, LAV CUVID (or CoreAVC) and CUDA you can do the same thing if you add ffdshow as a filter for the raw video. Poor ATI... again :D
E.g. debanding: ffdshow deband + madVR vs. Splash. Deinterlacing is just another postprocessing filter.
P.J wrote:Software decoding isn't right way
It's the right way for desktops, and for notebooks when you don't care about battery life. E.g. my desktop can decode 1080p video at about 400 fps, which is more than 8 times faster than my ATI card (which can't even handle 1080p@50fps). That has a lot of advantages: much faster seeking, no more profile@level worries, 10-bit support and so on.
Any modern dual-core can easily decode 4K, so 1080p is only a limit for really slow processors like a single-core Atom.
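The point above is simple arithmetic: if a CPU can decode at ~400 fps offline, real-time 24 fps playback only needs a small fraction of its decoding capacity. A quick sketch using the post's own benchmark figures:

```python
decode_throughput = 400  # fps, the desktop benchmark quoted above
playback_rate = 24       # fps, typical film content

headroom = decode_throughput / playback_rate      # how many times faster than real time
cpu_fraction = playback_rate / decode_throughput  # share of decode capacity actually used
print(f"headroom: {headroom:.1f}x, load: {cpu_fraction:.0%}")  # ~16.7x, ~6%
```

That ~6% is in the same ballpark as the "max 10% CPU usage" figure quoted elsewhere in the thread.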
P.J wrote:only good for some cases like 1080p60
Btw, with the new SB decoder in ffdshow my integrated video can handle 1080p@60fps with 16 reference frames (High@5.1), decoding perfectly (Splash still has artifacts). Obviously, nVidia can too.
P.J wrote:Just look at the CPU usage while decoding 1080
Max 10% for 1080p on my desktop :D, or the same 10% for 720p on my Acer.
P.J wrote:Ain't it crazy to burn CPU power for decoding it?
When you care about battery life on a notebook - yes. In other situations - no.
But when you care about battery, you don't care about quality.
P.J wrote:Did you forget CrystalHD or your previous netbook, hp mini 311?
Yes. Because it wasn't CrystalHD - it was a GeForce 9400M (aka ION, which has a slightly faster video decoding block than my desktop ATI card, lol).
P.J wrote:And madVR is just a codec.
Codec = coder/decoder.
So madVR isn't a codec. It's a renderer, like Haali or EVR or overlay...

The video has a low resolution, so the difference is huge. On HQ sources it's less noticeable, but still noticeable, especially when it comes to gradients: Splash (which decided to stretch them, for some reason) vs. madVR.
Splash also does some extra deblocking that kills a lot of detail: Splash, madVR.
madVR with softcubic50 vs. Splash (text is pixelated).
madVR with the default lanczos4 vs. Splash.
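The gradient banding being argued about here comes down to quantization: an 8-bit ramp can only take 256 distinct values, while a 10-bit one gets 1024, so smooth gradients break into fewer, wider visible steps at 8 bits. A minimal illustration (pure Python, my own toy example, not how either player actually processes video):

```python
def distinct_levels(levels, samples=4096):
    """Quantize a smooth 0..1 ramp to `levels` steps; count distinct output values."""
    return len({round(i / (samples - 1) * (levels - 1)) for i in range(samples)})

print(distinct_levels(256))   # 8-bit ramp: 256 bands
print(distinct_levels(1024))  # 10-bit ramp: 1024 bands
```

Fewer bands over the same brightness range means each band is wider on screen, which is exactly the stair-stepping a debanding filter (like ffdshow's) tries to hide.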
P.J wrote:Using it with some programs like MPC-HC is painful.
This is what real pain is :D
Desktop (Intel i7-970, ATI 5870, Windows 7 x64)
Acer Aspire TimelineX 3830TG (Intel i5-2410M, nVidia GT540M, Windows 7 x64)
PS Vita, Nokia N8

ocd
Posts: 41
Joined: Wed Aug 10, 2011 4:56 am
PC Specification: Core 2 Duo + GeForce 210 + Windows 10

Re: Best settings & decoder

Fri Sep 30, 2011 10:48 am

Thanks PJ. Knight: what I meant was, if your card has the power to decode the new codecs (h264 etc.), you will do fine. Although you will suffer from color banding, which is an all-new issue for me (see my other thread). Not a fault of Splash. Nvidia need to get their thinkers out of their stinkers.
Have a good one.

Knight
Posts: 12
Joined: Tue May 24, 2011 7:11 pm
PC Specification: Don't remember... :-D

Re: Best settings & decoder

Fri Sep 30, 2011 8:04 pm

ocd wrote:Thanks PJ. Knight: what I meant was, if your card has the power to decode the new codecs (h264 etc.), you will do fine. Although you will suffer from color banding, which is an all-new issue for me (see my other thread). Not a fault of Splash. Nvidia need to get their thinkers out of their stinkers.
Have a good one.
Yes, I got it, but in the end what I was/am concerned about is the filters/effects/settings: use Splash's? ATI's? None of them?

P.J
Posts: 300
Joined: Mon Jun 27, 2011 7:25 pm
PC Specification: i5-4460, 16GB RAM, GTX 960, Win10 x64

Re: Best settings & decoder

Sat Oct 01, 2011 1:53 am

Yes, Splash isn't suitable for crappy x264 rips. Anyway, I don't know why you are here :D
Sorry I couldn't answer everything; I prefer to spend my time on important topics...


Knight asked only 2 questions:

1) There's no single best setting, because it depends on many things:
the source, the monitor, the graphics card (CPU power for software decoding) and your eyes 8-)
I recommend trying all of the settings yourself if you have time.

2) madVR is a different story, and it's not possible. It's Splash ;)

Hope you got your answers =)

vivan
Expert User
Posts: 193
Joined: Sat Dec 19, 2009 10:56 am
PC Specification: Acer Aspire TimelineX 3830TG
Location: St. Petersburg, Russia

Re: Best settings & decoder

Sat Oct 01, 2011 12:07 pm

P.J wrote:Yes, Splash isn't suitable for crappy x264 rips.
It's 10-bit video, which is the highest-quality 720p rip available. And you call it crap... lol.
P.J wrote:Anyway, I don't know why you are here :D
Because somebody is misleading people ;)
Desktop (Intel i7-970, ATI 5870, Windows 7 x64)
Acer Aspire TimelineX 3830TG (Intel i5-2410M, nVidia GT540M, Windows 7 x64)
PS Vita, Nokia N8

P.J
Posts: 300
Joined: Mon Jun 27, 2011 7:25 pm
PC Specification: i5-4460, 16GB RAM, GTX 960, Win10 x64

Re: Best settings & decoder

Sat Oct 01, 2011 12:51 pm

OK, I'm a newbie user. Tell me how to install madVR (or whatever else) to get the best result.

It's easy for me to install and use Splash; I only need to run an exe setup file.
Now you tell me. I'm waiting =)

vivan
Expert User
Posts: 193
Joined: Sat Dec 19, 2009 10:56 am
PC Specification: Acer Aspire TimelineX 3830TG
Location: St. Petersburg, Russia

Re: Best settings & decoder

Sat Oct 01, 2011 2:18 pm

1) Go to Google and find the madVR thread.
// though if you type "Splash" into Google it comes up third, not first xD
2) Read readme.txt and do what's written there: "run install.bat".
// easier than setup.exe, right?
3) Switch to it in any supported player (MPC, KMPlayer, PotPlayer, ZoomPlayer, or something called J.River Media Center 16).
// one radio button in MPC's settings ("Output"), one pick from a list in the others.
4a) Be happy. Post #3 will also be useful.
// madVR is configured to produce the best result by default. Plus the included decoder supports 10-bit and lossless video, and is also pretty fast.
4b) If you are not happy with software decoding, use this decoder for Intel SB video cards (it's much better than DXVA or the decoder in Splash anyway, since it plays 1080p with 16 reference frames without artifacts), or this one for nVidia. ATI users can blame ATI for doing nothing (besides advertising "HD internet", lol).
// they are all free ;)

Anyway, nowadays it's really easy to get a pretty good result in almost any player without doing anything. Remember the DivX era with hundreds of codecs? :lol:
But if you want the best result, you have to do something...
Desktop (Intel i7-970, ATI 5870, Windows 7 x64)
Acer Aspire TimelineX 3830TG (Intel i5-2410M, nVidia GT540M, Windows 7 x64)
PS Vita, Nokia N8
