Rev 9/20/21
While looking into whether Telarc's early 50 kHz digital recordings were converted to CD-grade digital using an asynchronous sample-rate converter (which became available around 2006), I ran across a 2004 review of Telarc hybrid SACD-60634 (Saint-Saëns, Symphony No. 3 “Organ”; Eugene Ormandy, Philadelphia Orchestra). That disc was made by converting the 50 kHz master to DSD with a dCS 972 digital-format converter, then to analog, and then to CD-grade PCM with a custom Telarc ADC. The review indicates that Telarc had tried converting directly from 50 kHz to 44.1 kHz with a sample-rate converter, but the results didn't sound very good, due to shortcomings of the sample-rate converters of that era. The Wikipedia article on Soundstream gives detailed information on the digital recorder Telarc used for its 50 kHz recordings. Those recordings could be released as 50 kHz FLACs, minimizing the number of sample-rate conversions they would undergo on the way to analog.
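The awkwardness Telarc faced is that 50 kHz and 44.1 kHz are related by the inconvenient ratio 441/500. As a minimal sketch of that ratio arithmetic (my own illustration, not Telarc's or dCS's method), here is a naive linear-interpolation resampler in Python. Real converters use polyphase low-pass filtering; linear interpolation aliases badly, which hints at why crude early sample-rate conversion could sound poor:

```python
from fractions import Fraction
import math

def resample_linear(samples, src_rate, dst_rate):
    """Naive linear-interpolation resampler (illustration only).

    Quality converters low-pass filter with polyphase FIRs; this
    sketch only shows how fractional source positions arise when
    the two rates are related by a ratio like 441/500.
    """
    ratio = Fraction(dst_rate, src_rate)   # 44100/50000 reduces to 441/500
    n_out = int(len(samples) * ratio)
    out = []
    for i in range(n_out):
        pos = i / ratio                    # exact fractional source index
        lo = int(pos)
        frac = float(pos - lo)
        hi = min(lo + 1, len(samples) - 1)
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

# One second of a 1 kHz sine at 50 kHz, resampled to 44.1 kHz
src = [math.sin(2 * math.pi * 1000 * n / 50000) for n in range(50000)]
dst = resample_linear(src, 50000, 44100)
print(len(dst))  # 44100
```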
I've also learned that someone developed a way to convert 48 kHz recordings to 44.1 kHz very early in the digital era, and that Decca released many of its 48 kHz recordings in a CD collection known as The Decca Sound, which, according to S. Andrea Sundaram's piece on The Decca Sound, doesn't sound so great, although it's not clear exactly why.
So there are probably other examples of early digital recordings, with sampling rates other than 44.1 kHz, that were transferred to CD early in the CD era. Such recordings were released as LPs, so decent DACs must have existed to convert them to analog, and that analog output could have been digitized with something like the JVC VP-900, an oversampling ADC with a 16-bit/44.1 kHz output, apparently introduced in 1982. It would therefore have had 16-bit linearity and a linear-phase characteristic, so its recordings would have good low-level detail, good high-end detail, and good imaging, although it was expensive. In September 1985, Apogee introduced its linear-phase aftermarket input filters for the typical early digital recorder, and recording engineers adopted them in droves, so most digital recorders soon had a clean high end and good imaging. Before then, though, many digital recordings had poor detail and imaging.
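The Apogee filters themselves were analog hardware, but the digital counterpart of their linear-phase behavior is easy to demonstrate: a symmetric FIR filter delays every frequency equally, so waveforms keep their shape, which is the property associated above with good imaging. A sketch (my own illustration, a standard windowed-sinc design, not any particular product's filter):

```python
import math

def linear_phase_lowpass(num_taps, cutoff, sample_rate):
    """Windowed-sinc FIR low-pass filter.

    The taps are symmetric about the center, which makes the
    phase response exactly linear: a pure, frequency-independent
    delay of (num_taps - 1) / 2 samples.
    """
    fc = cutoff / sample_rate                       # normalized cutoff
    mid = (num_taps - 1) / 2
    taps = []
    for n in range(num_taps):
        x = n - mid
        sinc = 2 * fc if x == 0 else math.sin(2 * math.pi * fc * x) / (math.pi * x)
        hamming = 0.54 - 0.46 * math.cos(2 * math.pi * n / (num_taps - 1))
        taps.append(sinc * hamming)
    return taps

taps = linear_phase_lowpass(101, 20000, 50000)
# Coefficient symmetry is what guarantees the linear phase:
assert all(abs(taps[n] - taps[-1 - n]) < 1e-12 for n in range(len(taps)))
```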
However, mass-market CD players were generally lousy until at least 2010, due to the sound quality of low-cost audio-DAC chips. According to Benchmark's application note A Look Inside the New ES9028PRO Converter Chip and the New DAC3 (November 14, 2016):
"It has been a little over 7 years since ESS Technology introduced the revolutionary ES9018 audio D/A converter chip. This converter delivered a major improvement in audio conversion and, for 7 years, it has held its position as the highest performing audio D/A converter chip. But a new D/A chip has now claimed this top position. Curiously the successor did not come from a competing company; it came from ESS. On October 19, 2016, ESS Technology announced the all-new ES9028PRO 32-bit audio D/A converter. In our opinion, ESS is now two steps ahead of the competition!"
So I gather that the ES9018 Sabre DAC, introduced around 2010, was the first really good audio-DAC chip, and at the time it would have been too expensive to put in mass-market players. But now there are many good, inexpensive audio-DAC chips (although Benchmark is still partial to Sabre DACs), and Sabre DACs are appearing in low-cost players and DACs. I have a $100 Nobsound Bluetooth 4.2 Lossless Player from 2017 with a 9018 Sabre DAC, and it's amazing, although it's crude compared to Benchmark's DACs.
But due to piracy fears, the best versions of some albums are reserved for high-res streaming and LPs. According to various audio experts, including high-res expert Mark Waldrep, PhD (a.k.a. Dr. AIX), whose website is RealHD-audio.com, high-res recordings and LPs sound better than CDs because they're mixed and mastered better, not because of the recording format. So CDs could supposedly sound as good as high-res releases or LPs if they were mixed and mastered as well, and if their low-level detail were kept out of the dither region, where it is mixed with noise intended to mask the severe low-level distortion of 16-bit digital. There was a period in CD history known as the "loudness wars," when CDs were recorded at the highest possible level because louder discs sold better, perhaps in part because high recording levels kept the low-level details out of the dither region. Unfortunately, this approach required excessive compression and sometimes led to clipping.
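The "dither region" idea can be made concrete. Standard TPDF dither adds roughly plus or minus one LSB of triangular noise before 16-bit rounding, trading correlated low-level quantization distortion for benign noise; a signal well below one LSB literally vanishes without it. A hedged sketch (my own illustration, not a description of any particular mastering chain):

```python
import math
import random

def quantize_16bit(samples, dither, seed=0):
    """Quantize floats in [-1.0, 1.0] to 16-bit integer samples.

    With dither enabled, TPDF (triangular) noise spanning about one
    LSB is added before rounding, converting correlated low-level
    quantization distortion into benign, signal-independent noise.
    """
    rng = random.Random(seed)
    out = []
    for s in samples:
        x = s * 32767.0
        if dither:
            # TPDF dither = sum of two uniform +/-0.5 LSB sources
            x += rng.uniform(-0.5, 0.5) + rng.uniform(-0.5, 0.5)
        out.append(max(-32768, min(32767, round(x))))
    return out

# A tone at about half an LSB: without dither it rounds to pure
# silence; with dither it survives as signal plus noise.
tone = [0.5 / 32767.0 * math.sin(2 * math.pi * n / 100) for n in range(1000)]
print(any(quantize_16bit(tone, dither=False)))  # False (all zeros)
print(any(quantize_16bit(tone, dither=True)))   # True
```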