Re: A view HLSL questions and just wonder, what happened to this.
01/22/15 07:01 PM
> Short answer is no, that, specifically, wasn't implemented in MAME. My understanding is that RobeeJ's code was software-based (CPU), whereas MooglyGuy's was implemented in HLSL (GPU). I'm not going to tell you how to configure your system, but you shouldn't need much bloom to simulate a vector monitor. At higher bloom levels it doesn't look good, vector or raster.
I am aware of this CPU/GPU distinction; that's why RobeeJ hoped someone could "translate" his code into a DirectX shader. I know a person who could probably do this, and that's why I am asking. It wouldn't make sense if it were already in MAME. Reading the topic, it looked like many people liked his approach (including MooglyGuy).
I know that high bloom settings don't look nice, but high raster bloom still looks "natural", just overextended. Vector bloom at even slightly higher settings creates JPEG-like artefacts, and I just wonder why.
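For context on why overdriven bloom stops looking natural: a typical bloom pass adds a weighted, blurred copy of the bright areas back onto the image, and once that sum clips at full white, smooth gradients flatten into hard bands that can read as compression-like artefacts. This is a minimal illustrative sketch of that clipping effect, not MAME's actual HLSL code:

```python
# Illustrative bloom sketch (NOT MAME's shader): bloom adds a weighted,
# blurred copy of bright pixels back onto the base image. Output values
# clamp at 1.0, so a high weight flattens gradients into hard bands.

def bloom(base, blurred, weight):
    """Combine a base pixel with its blurred neighborhood, clamped to [0, 1]."""
    return min(1.0, base + weight * blurred)

# A smooth ramp of pixel intensities around a bright source:
ramp = [0.2, 0.4, 0.6, 0.8, 1.0]
blur = [0.5] * len(ramp)  # simplified uniform blur contribution

low  = [bloom(b, g, 0.2) for b, g in zip(ramp, blur)]   # gentle bloom
high = [bloom(b, g, 1.5) for b, g in zip(ramp, blur)]   # overdriven bloom

print(low)   # gradient mostly preserved
print(high)  # most values clamp to 1.0 -> "posterized" banding
```

Thin vector lines on a black background hit that clamp much sooner than a full raster image, which may be part of why vector bloom degrades faster as the setting goes up.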
> That is not 100% correct. You need SDLMAME, but it runs on Windows.
Ok, I am dumb and don't know the difference. Does SDLMAME have any downsides? I read in GroovyMAME posts that it can introduce higher input lag, but maybe I just don't understand it. I would just like to test this shader (if possible, not on Linux), that's all.
cheers u-man