I went through a similar conundrum when researching a new display. Here is how I interpreted my findings on the technologies:

1) Your source is likely only outputting at 50Hz or 60Hz at most (unless you output from a PC). Back in the day, when a 32" CRT TV was considered "huge" and the picture quality mediocre by today's standards, you didn't notice motion blur because, let's face it, the image was blurry to begin with.

2) There is no LED/OLED TV out there with a true 240Hz refresh rate. Most 4K UHD or SUHD sets run at 120Hz with some sort of graphics chip to enhance that to "240Hz".

So, back to motion blur and why it happens. With bigger TVs in the 60"-70" range now, there is a lot more image for your brain to notice flaws in. With a 60Hz source (and yes, even 4K broadcasts refresh all of those pixels at only 60Hz), your brain has a much bigger picture in which to notice the time delay between refreshed images. Your brain then tries to fill in the gaps with what it thinks should be happening between frames, and that creates the blur (it's not actually the TV). If you could pause your TV on each frame, you would be amazed at how sharp each image is and wonder where the blur went. The answer is that you are the source of the blur. Also, with sports, you have two reasons why blur is more likely than with a movie/TV show: the camera is panning to follow the action, and the object itself is moving fast.
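To put numbers on that refresh gap, here is a tiny Python sketch (my own illustration, nothing from any TV spec) computing the time between refreshed images at common rates:

```python
# Time between refreshed images at common refresh rates.
# At 60 Hz, the gap your brain has to "fill in" is about 16.7 ms.
for hz in (50, 60, 120, 240):
    frame_period_ms = 1000.0 / hz
    print(f"{hz:>3} Hz -> {frame_period_ms:.1f} ms between frames")
```

The bigger the screen, the more that ~17ms gap at 60Hz gets noticed.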

OK, so there are two technologies that help fool our brain: frame interpolation and black frame insertion (BFI). Frame interpolation basically inserts an extra frame (or 2, or 4) between each pair of frames in the 60Hz feed to create a 120Hz or 240Hz refresh. The catch is how well and how fast the chip can create the fake frames. BFI, instead of creating a fake frame, inserts a completely black frame to stop your brain from making assumptions. In both cases, if the chip gets it wrong, your brain will see the error or perceive the image as flickering.
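A toy Python sketch of the two techniques, just to show the idea. Real TV chips use motion-vector estimation to build the "fake" frames; the naive averaging here is my own stand-in, and the frames are just lists of brightness values:

```python
# Toy sketch of the two motion-smoothing techniques described above.
# Real chips estimate motion vectors; this naive blend only shows the idea.

def interpolate(frame_a, frame_b):
    """Create one 'fake' in-between frame by averaging pixel values."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

def frame_interpolation(frames):
    """Double a 60 Hz sequence to ~120 Hz by inserting blended frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(interpolate(a, b))   # the "fake" frame
    out.append(frames[-1])
    return out

def black_frame_insertion(frames):
    """Insert an all-black frame after every real frame (BFI)."""
    black = [0] * len(frames[0])
    out = []
    for f in frames:
        out.extend([f, black])
    return out

src = [[10, 20], [30, 40]]          # two real 60 Hz frames
print(frame_interpolation(src))     # real, blended, real
print(black_frame_insertion(src))   # real, black, real, black
```

If the blended frame guesses wrong (say, an object changed direction between the two real frames), that is exactly the artifact your brain catches.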

So here is the kicker... you get what you pay for in most cases. The better graphics chips cost more, and the cheaper TVs cut corners. Also, you need to know whether your brain responds better to interpolation, to BFI, or to a combination of both.

Generally speaking, tests seem to indicate that Samsung and LG have better graphics chips for dealing with the issue. I personally settled on a 120Hz, 65" 4K UHD LG model after first trying out a similarly priced Samsung model. What I did notice is that my wife's brain and mine must interpret motion differently, as she remarked that the Samsung appeared sharper to her during hockey. I believe Samsung uses a combined interpolation/BFI approach that favours BFI; for this reason, I found the image less bright. LG also uses a combined motion-smoothing technique, but it must favour interpolation and have better predictive algorithms, so the image appears crisper/brighter to me. Since I watch more sports than she does, my opinion mattered most (honestly, I think she couldn't have cared less).

Final note: you need to turn off motion enhancement when watching movies, or else the image looks too smooth and unnatural!

Edit: Oh, I forgot to mention why your plasma appeared to be sharper... it's related to how that technology works. Essentially, plasmas work like BFI: they use 8 to 10 rapid on/off sub-field cycles per frame (which is why they "claim" 480Hz (8x60Hz) and 600Hz (10x60Hz) refresh rates) to create one image. For this reason, plasmas use far more energy to create the same single frame, but the superior BFI-like behaviour makes motion appear more fluid. Unfortunately, they were expensive to build in the larger 60"+ sizes and were energy hogs.


Last edited by TDIPablo; 07/19/17 03:23 PM.