Answer: Because it's an irrelevant number.
In the early days of computers and printing you basically specified the print size by giving the computer an x-by-y pixel file and a DPI, and it would spit out the print. That's not a very efficient way for printers to manage print size. Welcome to post-1995: printers don't care any more. Your printing service or printing program will resize whatever DPI you give it to match the printer anyway.

Most printers you'd use on a photograph are 1200 DPI or higher, but even that is misleading. Each dot is a specific color: Cyan, Magenta, Yellow, or Black (CMYK). They combine to make a color. Higher-end printers use even more inks to offer more variations, or rather smoother variations. The dots are not all a uniform size, so there is overlap as well. And the printing program lays out the ink dots with the yellow on the bottom and the black on the top. My point is that it's a very complicated procedure that computer, software, and printer manufacturers have down to a science.

It's also something you can't affect very much. Change your DPI all day long and your 1000-pixel-wide image set to print at 2 inches will come out at something like 500 DPI, whether you set the file to 72 DPI, 300 DPI, 5,000,000 DPI, or even 10 DPI. And even that 500 DPI print isn't really printing at 500 DPI; it's printing at whatever native DPI the printer prints at. It's physically impossible for it to do it any other way.
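Here's the arithmetic above as a quick sketch: the resolution you actually get is pixels divided by physical inches, and the DPI number saved in the file never enters the calculation. (The function name is just mine, for illustration.)

```python
def effective_print_dpi(pixel_width, print_width_inches):
    """Resolution the print actually comes out at: pixels per physical inch."""
    return pixel_width / print_width_inches

# A 1000-pixel-wide image printed 2 inches wide, "tagged" at various DPI values.
# Note the tag isn't used anywhere in the math.
for file_dpi_tag in (10, 72, 300, 5_000_000):
    print(f"file tagged {file_dpi_tag} DPI -> prints at "
          f"{effective_print_dpi(1000, 2)} DPI")  # always 500.0
```

Run it and every line reports 500.0, no matter what the tag says.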
That isn't to say that if you set the program to save at 300 DPI that it shouldn't save at 300 DPI. But in the big scheme of things it's such an insignificant part of the whole picture. I really wouldn't get bent out of shape over it.
You'll also run across services that you need to send files to, and they'll specify ___ DPI. Most of the time it's just easier to comply than to argue with them, but frankly, if they can't figure out how to use a file with a different DPI than what they asked for, they need some retraining. Either that or they have some archaic software/hardware that is just fickle, but in that case they really should know how to work around it. Imagine taking your car to a mechanic and they say "sorry, can't work on this car, the speedometer reads kilometers per hour and not miles per hour." You'd wonder if the mechanic actually knew what they were doing. It's much the same thing here. If you're driving at 80 kilometers per hour, how much faster will you get to your destination than a car driving at 50 miles per hour? You won't; they're almost exactly the same speed.
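The speedometer analogy checks out as plain arithmetic (using the standard conversion factor of 1 mile = 1.609344 km): the two readings are the same speed in different units, just like the same pixels tagged with different DPI numbers.

```python
KM_PER_MILE = 1.609344  # standard international conversion factor

def kmh_to_mph(kmh):
    """Convert kilometers per hour to miles per hour."""
    return kmh / KM_PER_MILE

print(round(kmh_to_mph(80), 1))  # about 49.7 -- essentially 50 mph
```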
As for web viewing, your OS and browser will rescale any DPI to 72 DPI if the HTML is set to allow full-sized images. If it's not, and a size is specified, the browser will rescale the image to display at that size on the screen. The image will still be 72 DPI, just smaller or larger to meet the physical size. This happens no matter what DPI you set the image to. But even that is misleading: most monitors don't display at 72 DPI; most display higher, though some display lower. So the driver for your monitor in your OS converts the image that your OS converted to 72 DPI to match whatever DPI your monitor actually displays at. Look at these two images

Which one's bigger? What's the difference?
One is set to 1000 DPI, the other is set to 10 DPI. Can you really say there's a difference?
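The same logic can be sketched in a few lines: on screen, physical size depends on the monitor's pixel density, not on the DPI tag stored in the file. The monitor PPI values below are just illustrative assumptions, not measurements.

```python
def displayed_width_inches(pixel_width, monitor_ppi):
    """Physical width on screen: pixels divided by the monitor's density."""
    return pixel_width / monitor_ppi

# The same 800-pixel-wide image on three hypothetical monitors.
# Its DPI tag never appears in the math; only pixels and density matter.
for ppi in (72, 96, 220):  # old Mac, typical desktop, high-density laptop
    print(f"{ppi} PPI monitor -> "
          f"{displayed_width_inches(800, ppi):.2f} inches wide")
```

One file, three different physical sizes, and none of them were decided by the number stored in the image.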
My point is: quit worrying about it. It's irrelevant. As long as the pixel count doesn't change, you haven't had any destructive effects. Yes, I want Corel to fix it, but I'm not going to jump up and down about it either.
