Thanks, Alan, for jumping in. We've gotten off topic from the first post. I think it has been established that the difference between 720p and 1080p is slight, and might not justify the difference in price, especially at normal viewing distances. But if we're talking about minute differences, this one may be enough to influence your decision. It's certainly not a bad feeling to be a little future-proofed, however minimally.

Let me also note that I agree that any type of conversion (whether it be de-interlacing or scaling) can introduce errors in the end product.

Setting aside filmic vs. video-like presentation preferences for a moment, can you weigh in on Smokey's point about whether a DVD-encoded image (480i or 480p, for argument's sake) will look better on its native-resolution display than the same signal displayed on a higher-resolution monitor? For argument's sake, assume that the scaling and de-interlacing introduce no errors.

Do you think that the same image will look "softer" or equally as sharp or sharper on a 1080i or 720p set vs. a 480p set?

Working from some of your other comments, though, let me suggest the following: if "upconverting" a 480p signal to 720p is analogous to "upconverting" a 720p or 1080i signal to 1080p, and you claim that the 1080p picture looks slightly better, then by the same analogy an image upconverted from 480i/p to 720p or 1080i should also look slightly better.
Thus, it would seem that the "theoretical" softness which internal scalers introduce is just that - theory - and that in real life, we actually perceive that extra resolution (even if it is made up of "fake" pixels) as providing more detail, and thus, a sharper image.
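To make the "fake pixels" point concrete, here's a toy sketch of how a simple linear-interpolation upscaler fills in the new samples. This is just an illustration of the principle, not a claim about how any particular set's internal scaler actually works (real scalers use fancier filters): each invented pixel is a weighted average of its neighbors, which is exactly where the theoretical softness comes from, since a hard edge picks up intermediate values instead of a clean step.

```python
# Toy upscaler (illustration only): stretch a row of pixel values by
# linear interpolation. The new in-between pixels are weighted averages
# of their neighbors -- "fake" detail that softens hard transitions.

def upscale_row(row, factor):
    """Linearly interpolate a row of pixel values to factor * len(row) samples."""
    n = len(row)
    out_len = n * factor
    out = []
    for i in range(out_len):
        # Map each output position back into source coordinates
        src = i * (n - 1) / (out_len - 1)
        lo = int(src)
        hi = min(lo + 1, n - 1)
        frac = src - lo
        out.append(round(row[lo] * (1 - frac) + row[hi] * frac))
    return out

# A hard black-to-white edge...
edge = [0, 0, 255, 255]
# ...gains intermediate gray values when doubled in width:
print(upscale_row(edge, 2))  # [0, 0, 0, 73, 182, 255, 255, 255]
```

Whether that averaging reads to the eye as "soft" or as "more detail" is, of course, the perceptual question being debated here.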