I bought a 70" Vizio 4K at one of those week-of-Thanksgiving / Cyber Monday sales. At first I was a little hesitant because in the past picture-quality reviews were always won by Samsung and Sony, but after reading some reviews and being swayed by the super low price I went with the less high-end brand. I'm very well pleased. Picture quality is amazing. And since 4K is an even multiple of 720p and 1080p, everything looks great on it, even old content. Don't even think about getting an "up-convert" Blu-ray player. I can't imagine what it even does, since all of the pixels fit evenly into 4K. I'll be thrilled at the end of the year when actual 4K Blu-rays are released. Current content is very limited. There are just a few shows on Netflix and Amazon, assuming not too many other people in your neighborhood are sucking up your bandwidth.
Ahhh... not quite. Pixels are not merely duplicated when the set is upscaling. The resolution of the source data being a multiple of your device's also doesn't matter, other than that more pixels are better... period.
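To see what "merely duplicated" would even mean: here's a toy Python sketch of nearest-neighbor scaling at an integer factor, i.e. the naive "pixels fit evenly" picture from the post above. This is my own illustration, not what any real TV's scaler does.

```python
# Toy nearest-neighbor upscale: each source pixel is copied
# scale x scale times. Works only for integer scale factors.
def nearest_neighbor_upscale(image, scale):
    """image: 2D list of pixel values; scale: integer factor."""
    out = []
    for row in image:
        # duplicate each pixel horizontally...
        new_row = [px for px in row for _ in range(scale)]
        # ...then duplicate the whole row vertically
        out.extend([list(new_row) for _ in range(scale)])
    return out

src = [[0, 255],
       [255, 0]]
print(nearest_neighbor_upscale(src, 2))
```

The result is just bigger blocks of the same pixels; the "checkerboard" edges stay exactly as hard as before, which is why real sets don't stop here.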
Your TV is upscaling any time you play anything but 4K content that matches your screen. In the worst case, the device would use plain linear interpolation to compute pixels "between" the original source samples. But linear interpolation is trivial these days and isn't what real sets rely on... any dork could do that calculation.
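For the curious, that "trivial" baseline looks like this: a toy bilinear upscale of a 2D grayscale image, where each output pixel maps back to fractional source coordinates and blends the four surrounding source pixels. Again just my own sketch of the textbook method, not any set's actual pipeline.

```python
# Toy bilinear upscale of a grayscale image (list of lists).
def bilinear_upscale(image, new_h, new_w):
    h, w = len(image), len(image[0])
    out = []
    for oy in range(new_h):
        # map output row back to a fractional source row
        sy = oy * (h - 1) / (new_h - 1) if new_h > 1 else 0
        y0 = int(sy)
        y1 = min(y0 + 1, h - 1)
        fy = sy - y0
        row = []
        for ox in range(new_w):
            sx = ox * (w - 1) / (new_w - 1) if new_w > 1 else 0
            x0 = int(sx)
            x1 = min(x0 + 1, w - 1)
            fx = sx - x0
            # blend the four neighboring source pixels
            top = image[y0][x0] * (1 - fx) + image[y0][x1] * fx
            bot = image[y1][x0] * (1 - fx) + image[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

src = [[0, 100],
       [100, 0]]
print(bilinear_upscale(src, 3, 3))
```

Notice the interpolated pixels all land at 50: bilinear smears edges into soft gradients, which is exactly the blur that better kernels (bicubic, Lanczos, edge-adaptive) try to avoid.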
There are many interpolation algorithms, and just as many filters that are applied to the picture to produce a higher-resolution image on the 4K device given lesser source data.
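Most of those post-scaling filters boil down to small convolutions. Here's a toy 3x3 convolution with a common sharpening kernel; the kernel values and the border handling (clamping to the edge) are my own illustrative choices, not any particular TV's processing.

```python
# Toy 3x3 convolution, the building block of sharpening/smoothing
# filters. Borders are handled by clamping coordinates to the edge.
def convolve3x3(image, kernel):
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0
            for ky in range(3):
                for kx in range(3):
                    sy = min(max(y + ky - 1, 0), h - 1)
                    sx = min(max(x + kx - 1, 0), w - 1)
                    acc += image[sy][sx] * kernel[ky][kx]
            out[y][x] = acc
    return out

# A common sharpening kernel: boost the center, subtract neighbors.
# (A real pipeline would also clamp results back into 0..255.)
sharpen = [[ 0, -1,  0],
           [-1,  5, -1],
           [ 0, -1,  0]]
flat = [[10, 10, 10],
        [10, 50, 10],
        [10, 10, 10]]
print(convolve3x3(flat, sharpen))
```

The lone bright pixel in the middle gets pushed from 50 up to 210, which is the basic idea behind the "sharpness" processing most sets apply after scaling.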
These processes are only as good as the engineers and time put into them, and IMO, in the consumer market no one beats Sony TVs on motion and upscaling processing. That said, the Samsung 4K set I used did a good job.
Good upscaling will also reduce many of the aliasing artifacts you see on some sources, especially computer games. Some of these processes add game latency and can cause audio/video sync issues, though most sets have compensation for audio sync.
There are also some very fine devices out there that specialize in upscaling, and these are even better than any TV; in fact some of them cost more than a consumer TV. If the processing on the device and the TV complement each other, having both can be better, and that's more than likely the case, but I think it's possible to see some artifacts from stacking the two... I've just never tried to analyze or measure it.
EDIT: And of course if you really care, you need to know how the content provider and device compress and decompress the audio and video. I can't help you there; no need for me to learn this yet, but I likely will have to soon.