It'll still appear on Acer/Dell/Lenovo towers for decades to come, have no fear. The fact that the default output on a lot of those is still VGA, which is as old as me, is what makes me believe that. As for replacing monitors... eh. DVI still exists in the DVI-to-HDMI/DP cable market, so I can't say I think it'd be worth it, unless you're after 4K.
Is DVI dead? Yes. Do you need to worry about it being dead? Not really. DVI is obsolete tech, as it has insufficient bandwidth for high-refresh-rate / high-res monitors. However, since you have a 1080p 60 Hz monitor, DVI is and will remain perfectly sufficient. And HDMI-to-DVI / DP-to-DVI adapters are dirt cheap, so it going away on GPUs doesn't affect you either. In fewer words: until you choose to upgrade the monitor to one that needs more bandwidth than DVI can handle, the death of DVI will be a non-issue.
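The bandwidth point can be sanity-checked with a back-of-the-envelope pixel-clock calculation. Single-link DVI tops out at a 165 MHz pixel clock and dual-link at 330 MHz; the 1.1 blanking factor below is a rough stand-in for CVT reduced-blanking timings, not an exact figure:

```python
# Rough check of which display modes fit within DVI's pixel-clock limits.
SINGLE_LINK_MHZ = 165  # single-link DVI ceiling
DUAL_LINK_MHZ = 330    # dual-link DVI ceiling

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.1):
    """Approximate pixel clock: active pixels padded ~10% for blanking."""
    return width * height * refresh_hz * blanking / 1e6

print(pixel_clock_mhz(1920, 1080, 60))   # ~137 MHz: fits single-link
print(pixel_clock_mhz(1920, 1080, 144))  # ~328 MHz: needs dual-link
print(pixel_clock_mhz(3840, 2160, 60))   # ~547 MHz: beyond even dual-link
```

Which is why 1080p 60 Hz works fine over a cheap single-link HDMI-to-DVI cable, 1080p 144 Hz needed dual-link, and 4K was never going to happen over DVI.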
I've got a 1200p monitor with only DVI-D and VGA, and the RTX 2080 currently in my desktop only has a bunch of DisplayPorts and HDMIs (plus a USB Type-C) - but it came with a DP-to-DVI-D adapter in the box, which works fine.
The only issue with DVI actually being removed, nearly a decade after it was declared obsolete and four years after it stopped appearing on anything new, is for people who have weird 120/144 Hz monitors that are dual-link DVI only. Everyone else can just use an HDMI-to-single-link-DVI cable, or DP/HDMI in their native format.
HDMI to DVI adaptors or cables are dirt cheap. My monitor has VGA, DVI and DP - the DP is taken by my PC, so when I need to plug in my work laptop which is HDMI out only, I just use an HDMI to DVI cable and it works perfectly.
DVI isn't dead, it just smells funny... DVI is, broadly speaking, the same electrically as the video component of HDMI, so as long as the card has HDMI, an HDMI-to-DVI cable/adapter should allow a (single-link) DVI monitor to function as if it were connected DVI-to-DVI, with no conversion.
Do you really need a new monitor? You can pick up a DP-to-DVI adapter for like a fiver, including next-day shipping. As for that monitor you linked: 4K is a huge commitment in terms of GPU power required for gaming, plus there is the question of refresh rate - that is only a 60 Hz screen (like 99.9% of all 4K monitors). So the question worth asking is whether you'd get more out of a monitor upgrade by opting for a 1080p / 1440p 144 Hz one instead. Unfortunately there isn't really a right or wrong answer, so it can be hard to make a definitive recommendation either way. As for FreeSync / G-Sync: it needs to be supported by both the GPU and the monitor to work. FreeSync can run over DP or HDMI; G-Sync only over DP.
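To put the "huge commitment" in rough numbers: GPU load scales roughly with pixels per second pushed, so candidate monitors can be compared that way (a crude proxy only; real performance depends heavily on the game and settings):

```python
# Crude proxy for GPU load: pixels per second = resolution * refresh rate.
def pixels_per_second(width, height, refresh_hz):
    return width * height * refresh_hz

base = pixels_per_second(1920, 1080, 60)          # 1080p 60 Hz baseline
print(pixels_per_second(3840, 2160, 60) / base)   # 4K 60 Hz: 4x the pixels
print(pixels_per_second(1920, 1080, 144) / base)  # 1080p 144 Hz: 2.4x
print(pixels_per_second(2560, 1440, 144) / base)  # 1440p 144 Hz: ~4.3x
```

By this measure, a 1440p 144 Hz panel actually demands slightly more than 4K 60 Hz if you want to keep the refresh rate fed, which is part of why there's no clearly "right" pick.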
I believe you need DisplayPort for FreeSync, although some monitors (and some TVs, IIRC) can run it over HDMI (apparently mine can). As for G-Sync, no idea. Edit: Ninja'd by @Anfield
I'd have done it the way you have. When I first got a 4k monitor, I had a GTX970 to feed it a signal, and as long as you drop some settings, you can get playable frame rates and all the benefits of the high resolution monitor.
While obviously not ideal, running games at 1080p on a 4K screen won't look too bad, since it scales by an exact factor of two in each dimension (unlike, say, 1080p on a 1440p screen, where the non-integer scaling is a lot more problematic). So it's not all that bad - it may just take a while until you can make full use of the monitor.