I think it's maybe time for game developers to spend more effort on gameplay and a bit less on AAA+++ graphics that only top GPUs can handle. Maybe hard-target iGPUs/APUs as the mid-level baseline for games.
For that matter, plenty of room to remake/reskin older (fun) games... Bring the Black Mesa effect to lots of existing games.
Games now don’t even look good enough to justify the enormous computing power they require. They’re a stuttery and blurry mess, augmented by fake AI frames, LOD popping, weird screen space rendering defects, and so on. Graphics peaked 10 years ago.
After I read this article, I realized that I have a huge backlog of unplayed or partially played games in my Steam library. A significant chunk of those games are from indie devs -- probably enough to last me a year or two. Plus, most of them seem like they'll run just fine at high-to-max settings on my 3080 Ti at 1440p. Point is, I really need to stop watching for the next hardware refresh just to keep playing newer big-budget titles. That, plus current memory prices, made it easy to forget about upgrading for the next 2-3 years.
Nvidia is chasing trends again; they did it for crypto mining, and I bet some new fad will come up soon. It looks like the gaming industry needs to move to something other than the current GPU ecosystem in the long term. Intel Arc and Moore Threads have tried, but we really need a new way of designing and rendering graphics.
Or Nvidia could be broken up, with separate companies focusing on the consumer and enterprise divisions.
I'm sure half their employees will love being removed from the AI profit train.
I am starting to find it funny, even if it is sad... Well, thankfully Finnish has a term for this: surkuhupaisa (roughly, tragicomic)...
Nag screen
https://archive.ph/dJt9D
Gamers be damned, they are going to keep those prices high.
Before -- what, Covid? -- Nvidia's annual revenues were circa US$10B and AMD was competing with it to make GPUs. Now Nvidia is an accelerator manufacturer with a small legacy GPU business and revenues closer to US$200B, and AMD is still no closer to competing with Nvidia at making GPUs. Intel did start something in the meantime, but everyone seems to expect them to compete with Nvidia at making accelerators instead. Nobody is interested in GPUs anymore; accelerators are the much bigger and more interesting market.
Gamers should be glad they have Google and OpenAI funding the development of their GPUs.
I do feel there might be a day of reckoning where Nvidia bet the farm too hard on this AI bubble and it ends up blowing up in their face.
I hope gamers, systems integrators, and regular PC enthusiasts don't have the memory of a goldfish and go back to business as usual. It needs to hurt Nvidia in the pocketbook.
Will this happen? Unlikely, but hope springs eternal.
Nvidia's share price will take a hit when consolidation starts in AI, because their business won't be growing as fast as their P/E ratio implies. The circular deals could also hurt them if one of the AI providers they've invested in goes bust [1][2]. They won't go out of business, but holders of their shares may lose a lot of money. But will this happen only after Anthropic and OpenAI have their IPOs, possibly next year? Nvidia stands to make a lot on paper if those IPOs do well.
[1] https://finance.yahoo.com/news/nvidia-microsoft-back-anthrop...
[2] https://techcrunch.com/2025/10/12/nvidias-ai-empire-a-look-a...
If OpenAI has their IPO, retail is likely to get fleeced, given how pitiful the return on their investments to date has been. They are seeing revenues of around $13 billion for 2025, with claimed projections of $100 billion or more by 2030, but the investments they are making are orders of magnitude greater. Who is ultimately going to pay for this?
Surely OpenAI has customers buying their pro packages for ChatGPT, but that can't really be it. And businesses are starting to realize that AI can't replace the workforce that easily either.
Why take this personally? They are a business. It's one thing if they did something unethical, but I don't think this is grudge-worthy.
If anything, this shows how small the gaming market really is for them. That leaves an open field for new companies to capture.
I find it admirable that they are able to intensify focus on the area where they see the most value.
Hardly taking this personally. Just calling it how I see it most likely going. Also... Nvidia has done quite a bit unethically: violating anti-monopoly laws (though with the current US administration, those may as well not be worth the paper they're printed on), screwing with product reviewers, pulling a '90s-era Microsoft to obliterate their competition at all costs, and screwing over board partners like EVGA. Gamers Nexus on YouTube has covered plenty of this.
That said, although AI has some uniquely good applications, this AI mania is feeding into some ridiculous corporate feedback loop that is having a negative impact on the consumer.
Having to pay several thousand dollars for a top-tier consumer GeForce, when a few hundred dollars bought the same tier less than a decade ago, tells me the customer is being taken for a ride. It stinks.
I don't get this. Nvidia didn't "bet the farm" on AI. They are simply allocating limited resources (in this case memory) to their most profitable products. Yes, it sucks for gamers, but I see Nvidia more reacting to the current marketplace than driving that change.
If/when the AI bubble bursts, Nvidia will just readjust their resource allocation accordingly.
I also don't understand the common sentiment that if/when the AI bubble pops and hardware manufacturers come crawling back, we consumers are going to make them regret their decision.
Isn't the whole problem that all the manufacturers are pivoting away from consumers and toward AI? How are we going to "hurt Nvidia in the pocketbook?" Buy from their competitors? But they are also making these pivots/"turning their backs on us." Just abstain from buying hardware out of protest? As soon as prices go down there's gonna be a buying frenzy from everyone who's been waiting this whole time.
If/when the bubble pops, manufacturers will find that they can't butter their bread like they could when the datacenter craze was booming. In a world that is paved by growth, companies aren't very good at shrinking.
It doesn't matter what consumers do or don't do -- we plebeians are a tiny portion of their present market. We can buy the same GPUs from the same folks as before, or we can do something different, and it won't matter.
Whatever we do will be a rounding error in the jagged, gaping, infected hole where the AI market once was.
This is an even-handed take. I still think consumers in general should vote with their wallets, even if all of them put together won't hold a candle to their datacenter customers. If nothing else, it can grant the competition more market share, and maybe AMD and Intel can invest more into Radeon and Arc, respectively. That can only be a good thing, since I'd love to see more broad support for FSR and XeSS technologies on games, and ROCm and oneAPI for compute.
Oh, for sure. It's often good to bet on the underdog in a competitive market -- it helps ensure that competition continues to exist.
When I sold PC hardware, I'd try to find the right fit for a customer's needs and price point. Way back then, that often meant selling systems with relatively-inexpensive Cyrix or AMD CPUs and more RAM instead of systems with more-expensive Intel CPUs that had less RAM at any given price -- because those were good tradeoffs to make. By extension, I did a very small part to help foster competition.
But gamers drive the bulk of non-datacenter GPU sales and they don't necessarily act that way.
Having observed their behavior for decades, I feel confident in saying that they broadly promote whatever the top dog is today (whether they can afford to be in that club or not), and aren't shy about punching down on those who suggest a less-performant option regardless of its fitness for a particular purpose.
Or at least: The ones who behave this way sure do manage to be loud about it. (And in propaganda, loudness counts.)
I suspect they'll be fawning over nVidia for as long as nVidia keeps producing what is perceived to be the fastest thing, even if it is made from pure unobtanium.
Now you have me getting misty eyed a bit! I remember when my dad had a Cyrix PC and later an AMD K6-II+ whitebox build. :)
At any rate, you do have a point. I can't argue that Nvidia has an inferior product; yet I just wish Nvidia wasn't abusing their position so much.
The K6-2 was a trooper.
I had one of those for what seemed like an eternity.
At first, right out of the gate: I overclocked it from 300MHz to 350MHz just to see what would happen. It worked perfectly without further adjustment (and the next step did not), so I left it right there at 350MHz. For the price, at that time, it kept up great compared to what my peers had going on.
As the years ticked by and it was getting long in the tooth, it stayed around -- but it shifted roles.
I think the last thing it was doing for me was running a $25 SoundBlaster Live! 5.1's EMU10k1 DSP chip under Windows, using the kX audio drivers.
kX let a person use that DSP chip for what it was -- an audio-oriented DSP with some audio-centric IO. With kX, a person could drop basic DSP blocks into the GUI and wire them together arbitrarily, and also wire them into the real world.
I used it as a parametric EQ and active crossover for the stereo in my home office -- unless I was also using it as a bass preamp, in a different mode. Low-latency real-time software DSP was mostly a non-starter at that time, but these functions and routings were all done within the EMU10k1 and end-to-end latency was low enough to play a bass guitar through.
Of course: It still required a computer to run it, and I had a new family at that time and things like the electric bill were very important to me. So I underclocked and undervolted the K6-2 for passive cooling, booted Windows from a CompactFlash card (what spinning HDD?), and hacked the power supply fan to just-barely turn and rotate air over the heatsinks.
It went from a relatively high-cost past-performer to a rather silent, low-power rig with only one moving part that I'd remote into over the LAN to wiggle DSP settings on.
Neat chips, the K6-2 and EMU10k1 were.
Fun times.
(And to bring it all back 'round: We'd be in a different place right now if things like the K6-2 had been more popular than they were. I don't know if it'd be better or worse, but it'd sure be different.)
Dude seriously this is such a nice story. I especially love how you used the EMU10k1 DSP in conjunction with your K6 system to its fullest potential. :D
Speaking of sound cards, I distinctly remember the Sound Blaster Audigy being the very last discrete sound card my dad obtained before we settled for onboard audio: AC'97 at first, and later the HDA codec.
I do vaguely recall the kX drivers you mentioned, but I'm pretty sure we stuck with whatever came stock from Creative Labs, for better or for worse. Also... that SB16 emulation under DOS for the Live! and Audigy series cards was not great, being a carry-over from the ENSONIQ days. The fact that I needed EMM386 to use it was a bit of a buzzkill.
On the K6-II+ system we had, we used an AWE64 Gold on the good ol’ ISA bus. Probably my favorite sound card of all time, followed by the Aureal Vortex 2.
Sound cards were cool. Kids these days with their approximately-perfect high-res DACs built into their $12 Apple headphone adapters don't know what it was like. ;)
My mom had a computer with a SoundBlaster 16. I carried that sound card across the room one day for whatever reason a kid does a thing like that, and it got zapped pretty bad with static. It still worked after that, but it learned the strangest new function: It became microphonic. You could shout into the sound card and hear it through the speakers.
But other than being microphonic, the noise wasn't unusual: Sound cards were noisy.
At one point around the turn of the century, I scored a YMF724-based card that featured an ADC stage that actually sounded good, and was quiet. I used this with a FreeBSD box along with a dedicated radio tuner to record some radio shows that I liked. That machine wasn't fast enough to encode decent MP3s in real-time, but it was quick enough to dump PCM audio through a FIFO and onto the hard drive without skipping a beat. MP3 encoding happened later -- asynchronously. It was all scheduled with cron jobs, and with NTP the start times were dead-nuts on. (Sometimes, there'd be 2 or 3 nice'd LAME processes stacked up and running at once. FreeBSD didn't care. It was also routing packets for the multi-link PPP dialup Internet connection at the house, rendering print jobs for a fickle Alps MD-1000 printer, and doing whatever else I tossed at it.)
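If anyone's curious, the shape of that arrangement is easy to sketch. Here's a from-memory reconstruction in Python rather than the original shell-and-cron glue; the device path, LAME flags, and file layout are illustrative assumptions, not the real setup:

    # Capture now, encode later. Assumes an OSS-style device at
    # /dev/dsp already configured (via ioctl) for 16-bit 44.1 kHz
    # stereo PCM; real code would set that format explicitly.
    import subprocess, time

    RATE, CHANNELS, SAMPLE_BYTES = 44100, 2, 2
    BYTES_PER_SEC = RATE * CHANNELS * SAMPLE_BYTES

    def capture(seconds, pcm_path):
        # The hot path does no encoding at all: just move bytes from
        # the device to the disk without skipping a beat.
        remaining = seconds * BYTES_PER_SEC
        with open("/dev/dsp", "rb") as dev, open(pcm_path, "wb") as out:
            while remaining > 0:
                chunk = dev.read(min(65536, remaining))
                if not chunk:
                    break
                out.write(chunk)
                remaining -= len(chunk)

    def encode(pcm_path, mp3_path):
        # Run later by cron, niced so it yields to everything else.
        # -r tells LAME the input is headerless raw PCM; -s gives
        # the sample rate in kHz.
        subprocess.run(["nice", "lame", "-r", "-s", "44.1",
                        pcm_path, mp3_path], check=True)

    if __name__ == "__main__":
        stamp = time.strftime("%Y%m%d-%H%M")
        capture(30 * 60, f"/var/spool/radio/{stamp}.pcm")

The FIFO in the real setup just decoupled the two halves even further; cron fired off the encodes on whatever PCM files had piled up.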
I used 4front's OSS drivers to get there, which was amusing: IIRC, YMF724 support was an extra-cost item. And I was bothered by this because I'd already paid for it once, for Linux. I complained about that to nobody in particular on IRC, and some rando appeared, asked me what features I wanted for the FreeBSD driver, and they sent me a license file that just worked not two minutes later. "I know the hash they use," they said.
There are a few other memorable cards that I had at various points. I had a CT3670, which was an ISA SoundBlaster with an EMU8k that had two 30-pin SIMM sockets on it for sample RAM.
There was the Zoltrix Nightingale, which was a CMI8738-based device that was $15 brand new (plus another $12 or something for the optional toslink breakout bracket). The analog bits sounded like crap and it had no bespoke synth or other wizardry, but it had bit-perfect digital IO and a pass-through mode that worked as an SCMS stripper. It was both a wonderful and very shitty sound card, notable mostly because of this contrast.
I've got an Audigy 2 ZS here. I think that may represent the pinnacle of the EMU10k1/10k2 era. (And I'm not an avid gear hoarder, so while I may elect to keep that around forever, it's also likely to be the very last sound card I'll ever own.)
And these days, of course, things are different -- but they're also the same. On my desk at home is a Biamp Tesira. It's a fairly serious rackmount DSP that's meant for conference rooms and convention centers and such, with a dozen balanced inputs and 8 balanced outputs, and this one also has Dante for networked audio. It's got a USB port on it that shows up in Linux as a 2-channel sound card. In practice, it just does the same things that I used the K6-2/EMU10k1/kX machine for: An active crossover, some EQ, and whatever weird DSP creations I feel like doodling up.
But it can do some neat stuff, like: This stereo doesn't have a loudness control, and I decided that it should have something like that. So I had the bot help write a Python script that watches the hardware volume control that I've attached and assigned, computes Fletcher-Munson/ISO 226 equal-loudness curves, and shoves the results into an EQ block in a fashion that is as real-time as the Tesira's rather slow IP control channel will allow.
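The loop itself is simpler than it sounds. A minimal sketch of the idea, assuming a lot: the instance tags ("MyLevel", "MyEQ"), the exact Tesira Text Protocol attribute names, and the linear bass shelf standing in for proper ISO 226 interpolation are all made up for illustration, not lifted from my actual script:

    # Poll a level block; when the knob moves, push a low-frequency
    # boost into an EQ block. Command strings and instance tags are
    # hypothetical; check the Tesira Text Protocol docs for real ones.
    import socket, time

    HOST, REF_DB = "192.168.1.50", 0.0  # hypothetical IP; knob at 0 dB = "flat"

    def send(sock, cmd):
        sock.sendall((cmd + "\n").encode())
        return sock.recv(4096).decode()

    def bass_boost(volume_db):
        # Crude equal-loudness stand-in: the quieter the playback,
        # the more bass it needs. Real code would interpolate the
        # ISO 226 contours instead of this linear ramp.
        return min(12.0, 0.4 * max(0.0, REF_DB - volume_db))

    with socket.create_connection((HOST, 23)) as sock:
        last = None
        while True:
            # Ask the assigned level block where the knob sits now.
            volume_db = float(send(sock, "MyLevel get level 1").split()[-1])
            if volume_db != last:
                # Shove the result into band 1 of the EQ block.
                send(sock, f"MyEQ set gain 1 {bass_boost(volume_db):.1f}")
                last = volume_db
            time.sleep(0.5)  # the control channel is slow; don't spam it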
Holy cow. Again, kudos for the details. This has been a fantastic digression so far lol.
So I do strongly remember Sound Blaster cards, specifically of the SB16 variety, being jokingly referred to as “Noise Blasters” for quite some time, due to the horrible noise floor they had as well as all the hiss. One of the reasons I loved the AWE64 Gold was because Creative did manage to get that well under control by that point, along with other fixes introduced with DSP 4.16. I still have an AWE64 Gold in my collection, complete with the SPDIF bracket, that I will never sell, due to sentimental reasons.
The YMF724 card you mentioned… did that happen to have coaxial SPDIF perchance? I heard that, unlike the SPDIF implementation found on the AWE series cards from Creative, the YMF724 SPDIF carried all audio over it, even under DOS. Not just 44.1 kHz specific sound, which I believe Creative sourced from the EMU8k. Plus, as an added bonus, if your motherboard offered SBLINK (also known as PC/PCI), you could interface with the PCI sound card interrupts directly in DOS without memory-hogging TSRs.
As for the final sound card I ever owned before abandoning them, mine was the rather unique ESI Juli@, back in the 2011/2012 timeframe. I loved its zany breakout cable for MIDI and RCA connections, and the way the board could flip around to offer different styles of jacks.
One other remark that leads to a question: Linux users back in the day had a penchant for championing one audio API over another -- ALSA, OSS, or PulseAudio. Did you play around much with these in the dog days of Linux?
It's been fun so far.
For the YMF724: I really don't remember that part of it, but I'd like to think that if it had SPDIF built out, I really would have paid attention to that detail. The only reason I went through the horrors of using the cheap-at-every-expense CMI8738 Zoltrix card was to get SPDIF to feed an external DAC (and finally live in silence), and if the YMF724 I had included it, then my memories would be shaped differently. :)
And I'm usually pretty good with model numbers, but it's possible that this card really didn't have one. Back then, I got a lot of hardware from an amazing shop that sold things in literal white boxes -- stuff that they'd buy in bulk from Taiwan or wherever and stock on the shelves in simple white boxes with a card (in a static bag) inside. No book, no driver disk.
These boxes had a description literally pasted onto them; sometimes black-and-white, and sometimes copied on one of those fancy new color copiers, sometimes with jumper settings if appropriate -- and sometimes without. Some of the parts were name-brand (I bought a Diamond SpeedStar V330 from there with its minty nVidia Riva128 -- that one had a color label), but other times they were approximately as generic as anything could ever be.
Or, I'd pick up stuff even cheaper from the Dayton Hamvention. There were huge quantities of astoundingly-cheap computer parts of questionable origin moving through that show.
But no, no SPDIF on that device that I recall. It may have been on the board as a JST or something, but if it was then I absolutely never used it.
I do remember that bit about the EMU8k's SPDIF output -- my CT3670 had that, too. IIRC it was TTL-level and not galvanically-isolated or protected in any way, on a 2-pin 0.1" header. IIRC, it didn't even have the 75 Ohm terminating resistor that should have been there. I was disappointed by the fact that it only output audio data from the EMU8k, since that part didn't handle PCM audio.
But! There was a software project way back then that abused the EMU8k to do it anyway: Load up the sample RAM with some PCM, and play it. Repeat over and over again with just the right timing (loading samples in advance, and clearing the ones that have been used), give it a device name, and bingo-bango: A person can play a high-latency MP3 over SPDIF on their SoundBlaster AWE-equivalent.
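In other words: a classic ping-pong buffer, grafted onto a sampler that was never meant to stream. Something like this, where every emu8k_* function is an invented stand-in for whatever the real project's driver interface actually was:

    # Double-buffered PCM streaming through the EMU8k's sample RAM.
    # The emu8k_* functions are hypothetical stubs; the real project
    # talked to the hardware directly.
    BLOCK = 4096                  # samples per chunk of sample RAM
    SLOTS = [0, BLOCK]            # two regions, used alternately

    def emu8k_load(addr, pcm): pass  # stub: upload PCM into sample RAM
    def emu8k_play(addr): pass       # stub: trigger a "note" at addr
    def emu8k_wait(addr): pass       # stub: block until playback ends

    def play_stream(blocks):
        blocks = iter(blocks)
        emu8k_load(SLOTS[0], next(blocks))   # prime the first slot
        cur = 0
        for pcm in blocks:
            emu8k_play(SLOTS[cur])           # play the loaded slot...
            emu8k_load(SLOTS[1 - cur], pcm)  # ...while refilling the other
            emu8k_wait(SLOTS[cur])           # the timing is the hard part
            cur = 1 - cur

The whole trick lives in that load-while-playing overlap.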
I was never able to make it work, but I sure did admire the hack value. :)
That ESI Juli@ is a very clever bit of kit, and I've not ever seen one before. I'm a bit in awe of its flexibility; the flip-card business is brilliant. There's got to be applications where that kind of thing could be used in the resurgent analog synth world.
It's very different, but for some reason it reminds me of the Lexicon Core 2 we used in a studio from 1999 until 2002 or so. This had its (sadly unbalanced) 4 inputs and 8 outputs on an external breakout box, and we gave it another 8 channels in each direction by plugging it into an ADAT machine. That was an odd configuration, and bouncing through the hardware reverb on the card was even odder.
The Core 2 did not work with the then-cutting-edge Athlon box we built for it and that was a real bummer -- we spent a lot of money on that rig, and I spent a ton of time troubleshooting it before giving up. (We then spent a lot more money replacing that board with a slotted Pentium 3.)
ALSA, OSS, PulseAudio: Yeah, all of those. I paid for OSS fairly early on, and that was also always very simple to make work -- and it did work great as long as a person only did one thing at a time. I really enjoyed the flexibility of ALSA -- it let me plug in things like software mixers, so I could hear a "ding" while I was playing an MP3. And I liked the network transparency of PulseAudio ("it's kind of like X, but for sound!") but nobody else really seemed interested in that aspect around that time.
If I had to pick just one as a favorite, it would definitely be OSS: The concept of one sound card with exactly one program that completely owned the hardware until it was done with it allowed for some very precise dealings, just like with MS-DOS. It felt familiar, plain, and robust.
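For anyone who never touched it, the entire OSS model fits in a dozen lines. This sketch uses Python's old stdlib ossaudiodev module (deprecated in 3.11, removed in 3.13), so it only runs on an older Python with an OSS device around:

    # Open the device, declare a format, write samples. While this
    # process holds /dev/dsp open, it owns the sound card outright;
    # anything else that tries to open it just gets EBUSY.
    import math, struct, ossaudiodev

    dsp = ossaudiodev.open("w")          # exclusive grab of /dev/dsp
    dsp.setfmt(ossaudiodev.AFMT_S16_LE)  # 16-bit little-endian PCM
    dsp.channels(1)
    rate = dsp.speed(44100)              # driver may round; use its answer

    # One second of a 440 Hz sine, written straight at the hardware.
    dsp.write(b"".join(
        struct.pack("<h", int(20000 * math.sin(2 * math.pi * 440 * n / rate)))
        for n in range(rate)))
    dsp.close()                          # ...and the card is free again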
You?
I certainly have no delusions of Nvidia going bankrupt. In fact, they will certainly make it to the other side without much issue. That said, I do foresee Nvidia taking a reputational hit, with AMD and (possibly) Intel gaining more mindshare among consumers.
AMD will eat their lunch, and already are. If AMD can properly replace CUDA and RTX HDR, they could win the market.