This topic is fascinating to me. The Toy Story film workflow is a perfect illustration of intentional compensation: artists pushed greens in the digital master because 35 mm film would darken and desaturate them. The aim was never neon greens on screen, it was colour calibration for a later step. Only later, when digital masters were reused without the film stage, did those compensating choices start to look like creative ones.
I run into this same failure mode often. We introduce purposeful scaffolding in the workflow that isn’t meant to stand alone, but exists solely to ensure the final output behaves as intended. Months later, someone is pitching how we should “lean into the bold saturated greens,” not realising those greens only exist because we specifically wanted neutral greens in the final output. The scaffold becomes the building.
In our work this kind of nuance isn’t optional, it is the project. If we lose track of which decisions are compensations and which are targets, outcomes drift badly and quietly, and everything built after is optimised for the wrong goal.
I’d genuinely value advice on preventing this. Is there a good name or framework for this pattern? Something concise that distinguishes a process artefact from product intent, and helps teams course-correct early without sounding like a semantics debate?
There’s an analog analogue: mixing and mastering audio recordings for the devices of the era.
I first heard about this when reading an article or book about Jimi Hendrix making choices based on what the output sounded like on AM radio. Contrast that with the contemporary recordings of The Beatles, in which George Martin was oriented toward what sounded best in the studio and home hi-fi (which was pretty amazing if you could afford decent German and Japanese components).
Even today, after digital transfers and remasters and high-end speakers and headphones, Hendrix’s late-60s studio recordings don’t hold a candle to anything the Beatles did from Revolver on.
> There’s an analog analogue: mixing and mastering audio recordings for the devices of the era.
In the modern day, this has one extremely noticeable effect: audio releases used to assume that you were going to play your music on a big, expensive stereo system, and they tried to create the illusion of the different members of the band standing in different places.
But today you listen to music on headphones, and it's very weird to have, for example, the bassline playing in one ear while the rest of the music plays in your other ear.
I know you're looking for something more universal, but in modern video workflows you'd apply a chain of color transformations on top of the final composited image to compensate for the display you're working with.
So I guess try separating your compensations from the original work and create a workflow that automatically applies them, something like the sketch below.
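A minimal sketch of what I mean, assuming frames are float RGB arrays in [0, 1]; the green boost is a made-up illustrative number, not a real production value:

```python
import numpy as np

def master_render(frame: np.ndarray) -> np.ndarray:
    """The target image: what the artists actually intend to be seen."""
    return frame  # creative work happens upstream of this point

def film_print_compensation(frame: np.ndarray) -> np.ndarray:
    """Compensation applied only when targeting a film print: push greens
    because the print will darken and desaturate them (illustrative gain)."""
    gain = np.array([1.0, 1.15, 1.0])
    return np.clip(frame * gain, 0.0, 1.0)

def deliver(frame: np.ndarray, target: str) -> np.ndarray:
    """Compensations live here, keyed by output target, so reusing the
    master for a new target never silently inherits an old hack."""
    out = master_render(frame)
    if target == "35mm_print":
        out = film_print_compensation(out)
    return out

if __name__ == "__main__":
    frame = np.random.rand(4, 4, 3).astype(np.float32)
    print(deliver(frame, "digital").mean(), deliver(frame, "35mm_print").mean())
```

The point is only that the compensation is named, isolated, and tied to the target it exists for, so nobody later mistakes it for the look itself.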
Theory: Everything is built on barely functioning ruins with each successive generation or layer mostly unaware of the proper ways to use anything produced previously. Ten steps forward and nine steps back. All progress has always been like this.
In Toy Story's case, the digital master should have had "correct" colors, and the tweaking done in the transfer to film step. It's the responsibility of the transfer process to make sure that the colors are right.
Now, counter arguments could be that the animators needed to work with awareness of how film changes things; or that animators (in the hand-painted era) always had to adjust colors slightly.
---
I think the real issue is that Disney should know enough to tweak the colors of the digital releases to match what the artists intended.
Not scaffolding in the same way, but, two examples of "fetishizing accidental properties of physical artworks that the original artists might have considered undesirable degradations" are
- the fashion for unpainted marble statues and architecture
- the aesthetic of running film slightly too fast in the projector (or slightly too slow in the camera) for an old-timey effect
Isn't the entire point of "reinventing the wheel" to address this exact problem?
This is one of the tradeoffs of maintaining backwards compatibility and stewardship -- you are required to keep track of each "cause" of that backwards compatibility. And since the number of "causes" can quickly become innumerable, that's usually what prompts people to reinvent the wheel.
And when I say reinvent the wheel, I am NOT describing what is effectively a software port. I am talking about going back to ground zero, and building the framework from the ground up, considering ONLY the needs of the task at hand. It's the most effective way to prune these needless requirements.
It seems pretty common in software - engineers not following the spec. Another thing that happens is the pivot: you realize the scaffolding is what everyone wants and sell that instead. The scaffold becomes the building and also the product.
That’s a great observation. I’m hitting the same thing… yesterday’s hacks are today’s gospel.
My solution is decision documents. I write down the business problem, background on how we got here, my recommended solution, alternative solutions with discussion of their relative strengths and weaknesses, and finally an executive summary that states the whole affirmative recommendation in half a page.
Then I send that doc to the business owners to review and critique. I meet with them and chase down ground truth. Yes it works like this NOW but what SHOULD it be?
We iterate until everyone is excited about the revision, then we implement.
There are two observations I've seen in practice with decision documents: the first is that people want to consume the bare minimum before getting started, so such docs have to be very carefully written to surface the most important decision(s) early, or otherwise call them out for quick access. This often gets lost as word count grows and becomes a metric.
The second is that excitement typically falls with each iteration, even while everyone agrees that each is better than the previous. Excitement follows more strongly from newness than rightness.
Eventually you'll run into a decision that was made for one set of reasons but succeeded for completely different reasons. A decision document can't help there; it can only tell you why the decision was made.
That is the nature of evolutionary processes and it's the reason people (and animals; you can find plenty of work on e.g. "superstition in chickens") are reluctant to change working systems.
There's a similar issue with retro video games and emulators: the screens on the original devices often had low color saturation, so the RGB data in those games were very saturated to compensate. Then people took the ROMs to use in emulators with modern screens, and the colors are over-saturated or just off. That's why you often see screenshots of retro games with ridiculously bright colors. Thankfully now many emulators implement filters to reproduce colors closer to the original look.
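To make the compensation concrete, here's a toy version of the kind of "original screen" filter such emulators offer, assuming float RGB frames in [0, 1]; the desaturation amount and gamma are guesses, not measurements of any real panel:

```python
import numpy as np

def desaturate(rgb: np.ndarray, amount: float) -> np.ndarray:
    """Blend each pixel toward its luma by `amount` (0 = unchanged, 1 = grey)."""
    luma = rgb @ np.array([0.299, 0.587, 0.114])
    return rgb * (1.0 - amount) + luma[..., None] * amount

def washed_out_lcd(rgb: np.ndarray) -> np.ndarray:
    """Approximate a low-saturation handheld screen on a modern display."""
    out = desaturate(rgb, 0.35)        # pull the over-saturated ROM colors back down
    out = np.power(out, 1.0 / 1.1)     # slight lift, as on a dim non-backlit panel
    return np.clip(out, 0.0, 1.0)

if __name__ == "__main__":
    frame = np.random.rand(8, 8, 3)    # stand-in for an emulated frame
    print(washed_out_lcd(frame).shape)
```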
With the GBA, the original GBA screen and the first-gen GBA SP had very washed out colors, not saturated at all. The Mario ports to the GBA looked doubly washed out since they desaturated their colors and were shown on a desaturated screen. I've heard that the real reason the colors were desaturated was that the first GBA model didn't have a backlight, so the colors were lightened to be more visible, but I'm not quite sure that's the case. Lots of other games didn't do that.
And with the second version of the GBA SP and the GB Micro, colors were very saturated. Particularly on the SP. If anything, cranking up the saturation on an emulator would get you closer to how things looked on those models, while heavily desaturating would get you closer to the look on earlier models.
Ah yes, we often get folks in the nesdev community bickering over which "NES Palette" (sourced from their favorite emulator) is the "best" one. The reality is extraordinarily complicated and I'm barely qualified to explain it:
In addition to CRTs having variable properties, it turns out a lot of consoles (understandably!) cheat a little bit when generating a composite signal. The PPU's voltages are slightly out of spec, its timing is weird to work around a color artifact issue, and it generates a square wave for the chroma carrier rather than an ideal sine wave, which produces even more fun problems near the edges. So we've got all of that going on, and then the varying properties of how each TV chooses to interpret the signal. Then we throw electrons at phosphors and the pesky real world and human perception gets involved... it's a real mess!
Aha! I used to work in film and was very close to the film scanning system.
When you scan in a film you need to dust-bust it and generally clean it up, because there are physical scars on the film from going through the projector. There's also a shit ton of dust that needs to be physically or digitally removed, i.e. "busted".
If you're unlucky you'll use a telecine machine, https://www.ebay.co.uk/itm/283479247780 which runs much faster, but has less time to dust-bust and properly register the film (so it'll warp more).
However! That doesn't affect the colour. Those colour changes are deliberate and are a result of grading. I.e., a colourist has gone through and made changes to make each scene feel more effective. Ideally they'd alter the colour for emotion, but that depends on who's making the decision.
That has been something I've wondered about since seeing frame comparisons of (probably) telecine'ed prints of The Matrix vs. the myriad home video releases.
I'm a colorist and it absolutely does affect color. Every telecine is different and will create a different-looking scan. Telecine operators will do a one-light pass to try and compensate, but any scan needs to be adjusted to achieve the artist's original vision.
Someone correct me if I'm wrong, but I believe it builds a static charge as it runs through the projector and attracts dust. I say this because I remember holding my hand near moving film in our (home) movie projector, and as a kid enjoying feeling the hairs on my arm standing up from the static. Maybe professional gear protects against that somehow, but if not that'd be why.
It's a surprisingly common error where someone picks up an old 35mm print and assumes it is somehow canonical... Besides whatever the provenance of these prints is (this gets complicated), the reality is that these were also made to look as good as they could for typical movie theater projector systems in the 90s. These bulbs were hot and bright and there were many other considerations around what the final picture would look like on the screen. So yeah, if you digitize 35mm film today, it will look different, and different from how it's ever been displayed in a movie theater.
Agreed. It's a fine article but leaves half the story on the table. It is supposedly comparing what these movies looked like in the theater to the modern streaming and bluray versions, but is actually comparing what a film scan (scanner settings unspecified) projected on a TV (or other unspecified screen) looks like compared to the digital versions on (presumably) the same screen. And then we can ask: how were the comparison images captured, rendered to jpeg for the web, before we the readers view them on our own screens? I'm not arguing Catmull and company didn't do a great job of rendering to film, but this comparison doesn't necessarily tell us anything.
Don't believe me? Download the comparison pictures in the article to your device and play with filters and settings. You can get almost anything you want and the same was true at every step in the render pipeline to your TV.
Ps - and don't get me started on how my 60-year-old eyes see color now compared to what they perceived when I saw this in the theater
It’s an interesting and valid point that the projectors from the time would mean current scans of 35mm will be different too. However, taking the Aladdin screenshot in particular, the sky is COMPLETELY the wrong colour in the modern digital edition, so it seems to me at least that these 35mm scans, whilst not a perfect match for the 90s, are closer to correct than their digital counterparts.
And as someone who is part of those conservation communities that scan 35mm with donations to keep the existing look, a lot of the people doing those projects are aware of this. They do some color adjustment to compensate for print fading, for the type of bulb that was used in movie theatres back then (using a LUT), etc...
I do find that often enough commercial releases like Aladdin or other movies like Terminator 2 are done lazily and have completely different colors than what was historically shown. I think part of this is the fact that studios don't necessarily recognise the importance of that legacy and don't want to spend money on it.
There was similar outrage (if that's the right word) about a Matrix remaster that either added or removed a green color filter, and there's several other examples where they did a Thing with colour grading / filtering in a remaster.
To me, that just looks like what happens when I try to play HDR content on a system that doesn't know about HDR. (It looks like you're watching it through sunglasses.)
See my top level comment for more info on this, but the Aladdin scan used in the article was from a 35mm trailer that's been scanned on an unknown scanner, and had unknown processing applied to it. It's not really possible to compare anything other than resolution and artefacts in the two images.
And it was made by a lab that made choices on processing and developing times, which can quite easily affect the resulting image. You hope that labs are reasonably standard across the board and calibrate frequently, but even processing two copies of the same material in one lab, one after the other, will result in images that look different if projected side by side. This is why it's probably impossible to make new prints of 3-strip Cinerama films now; the knowledge and the number of labs that can do this are near zero.
This reminds me of how pre-LCD console games don't look as intended on modern displays, or how vinyl sounds different from CDs because mixing and mastering targeted physical media with limitations.
Wasn't CD more a case of cheaping out? The work was done once, mostly with radio in mind, where the assumed listening scenario was the car or background music, so less dynamic range allowed it to be louder on average.
CD itself can reproduce the same dynamic range and more, but that doesn't sell extra copies.
The loudness war was a thing in all media. In the 80s most of us didn't have CD players but our vinyl and tapes of pop and rock were all recorded overly loud. Compared to the classical and jazz recordings, or perhaps the heyday of audiophile 70s rock, it was annoying and sad.
> It's a surprisingly common error where someone picks up an old 35mm print and assumes it is somehow canonical
Same applies for people buying modern vinyl records believing them to be more authentic than a CD or (god-forbid) online streaming.
Everything comes from a digital master, and arguably the vinyl copy adds artefacts and colour to the sound that are not part of the original recording. Additionally, the vinyl is not capturing more overtones because it's analogue; there is no true analogue path in modern music any more.
I don't know if this is still true, but I know that in the 2000s the vinyls usually were mastered better than the CDs. There even was a website comparing CD vs vinyl releases, where the person hosting it was lamenting this fact because objectively CDs have a much higher dynamic range than vinyls, although I can't find it now. CDs were a victim of the loudness war[0].
Allegedly, for a lot of music that is old enough, the best version to get (if you have the kind of hifi system that can make use of it) is an early-80s CD release, because it sits in a sweet spot: it predates the loudness war, and producers were actually using the dynamic range of the CD.
The loudness wars were mostly an artifact of the 90s-2010s, because consumers were listening on horrible plasticky iPod earbuds or cheap Logitech speakers and the music had to sound good on those.
Once better monitors became more commonplace, mastering became dynamic again.
This is most clear with Metallica's Death Magnetic, which is a brickwalled monstrosity on the 2008 release but was fixed on the 2015 release[0]. And you can see this all over, where albums from the 90s had a 2000s "10-year anniversary" remaster that is heavily compressed, but then a 2010s or 2020s remaster that is dynamic again.
[0] Interestingly enough between those dates, fans extracted the non-brickwalled Guitar Hero tracks and mastered them as well as they could. Fun times :).
I dunno about authentic but for a while (as another commenter pointed out) they didn't have the loudness maxed out and / or had better dynamic range. That said, music quality aside, vinyls have IMO better collectability value than CDs. They feel less fragile, much more space for artwork and extras, etc.
I think the entire premise of the article should be challenged. Not only is 35mm not meant to be canonical, but the 35mm scans the author presented are not what we saw, at least for Aladdin.
I've watched Aladdin more than any other movie as a child, and the Blu-ray screenshot is much more familiar to me than the 35mm scan. Aladdin always had the Velvia look.
> Early home releases were based on those 35 mm versions.
Famously CRT TVs didn't show as much magenta so in the 90s home VHS releases compensated by cranking up the magenta so that it would be shown correctly on the TVs of the time. It was a documented practice at the time.
So, yes the VHS is expected to have more magenta.
Anecdotally, I remember watching Aladdin at the movie theatre when it came out and later on TV multiple times and the VHS you saw doesn't correspond to my memories at all.
For sure, the author simplified things for the article. Anyway, in the case of VHS, they were indeed based on the 35mm scan but then had additional magenta added (as well as pan and scan to change the aspect ratio).
The author is not wrong that oversaturation is a source-transfer phenomenon (which will always be different unless special care is taken to compare with the source material).
On most TVs that magenta wouldn't have shown as much as the youtube video shows because TVs tended to have weaker magentas. Of course, it's not like TVs were that uniformly calibrated back then and there were variations between TVs. So depending on the TV you had, it might have ended up having too much magenta but that would have usually been with more expensive and more accurate TVs.
TLDR: Transfers are hard, any link in the chain can be not properly calibrated, historically some people in charge of transferring from one source to another compensated for perceived weak links in the chain.
The magenta thing is interesting. I learned something new. Reading the other comments, this seems to be as much a tale of color calibration as anything.
Regarding my memory, it becomes shakier the more I think about it. I do remember the purples but me having watched the cartoon could have affected that.
It sounds like in the case of Toy Story, the Pixar team were working toward a 35mm print as the final product, so that probably should be considered canonical: it's what the creative team set out to make.
This makes so much more sense now. After having kids I've been watching my fair share of Pixar, and I never remembered everything looking so flat and bland, but I would always chalk it up to my brain misremembering how it looked at the time. Good to know, I guess, that it wasn't just nostalgia, but sad that we continue to lose some of this history, and so soon.
Yeah I clicked this link going “oh god it’s because they printed to film, I bet, and man do I hope it looks worse so I don’t have to hunt down a bunch of giant 35mm scans of even more movies that can’t be seen properly any other way”
But no, of course it looks between slightly and way better in every case. Goddamnit. Pour one out for my overworked disk array.
And here I was thinking it was just my imagination that several of these look kinda shitty on Blu-ray and stream rips. Nope, they really are worse.
Piracy: saving our childhoods one frame at a time.
I'm not sure why you're getting downvoted. What you're hinting at is that a lot of original 35mms are now getting scanned and uploaded privately, especially where all the commercial releases on Blu-ray and streaming are based on modified versions of the original movies, or over-restored versions.
These can be especially hard to find as the files are typically enormous, with low compression to keep things like grain. I see them mostly traded on short-lived gdrives and Telegram.
> I see them mostly traded on short-lived gdrives and Telegram.
Someone tell this community to share over BT. Aint nobody got time to keep up with which platform/server everyone is on and which links are expired and yuck.
The main reason they are not shared as widely is that there's a bit of conflict within the community between those that really want to stay under the radar and not risk being targeted by copyright owners (and so try to keep things very much private between the donors who funded the 600-900 usd cost of the scans) and those who want to open up a bit more and so use telegram, reddit and upload to private trackers.
One’s an accurate recording of how a real thing looked.
The other’s fake noise.
One’s a real photo from 1890. The other’s an old-timey instagram filter.
It makes sense that some folks might care about the difference. Like, I love my old family Polaroids. I would not want a scanned version of those to have the “noise” removed for compression’s sake. If that had been done, I’d have limited interest in adding fake noise back to them. By far my favorite version to have would be the originals, without the “noise” smoothed out at all.
Lots of folks have similar feelings about film. Faked grain isn’t what they’re after, at all. It’s practically unrelated to what they’re looking for.
> One’s an accurate recording of how a real thing looked.
> The other’s fake noise
But since there is no such thing as the real thing, it could just as well match one of the many real noise patterns in one of the many real things floating around, or a real thing at a different point in time with more/less degradation. And you wouldn't even know the difference, thus...
> It makes sense that some folks might care about the difference
Not really, it doesn't make sense to care about identical noise you can't tell apart. Of course, plenty of people care about all kinds of nonsense, so that won't stop those folks, but let's not pretend there is some 'real physics' involved
I think you missed the "a" vs "the": you can encode different sources that would have different grains, or the same source would have different grain at different times.
But also a simulation called compression of a real thing is different from that real thing, so that purity test had already been failed
Well film grain doesn't matter because compression exists, apparently, and may as well be simulated because it's already failed the "purity test" and may as well be algo-noise. That holds for everything else! May as well maximize the compression and simulate all of it then.
[EDIT] My point is "film grain's not more-real than algo noise" is simply not true, at all. An attempt to represent something with fidelity is not the same thing as giving up and faking it entirely based on a guess with zero connection to the real thing—its being a representation and not the actual-real-thing doesn't render it equally as "impure" as a noise-adding filter.
You're still dancing there in the slop, hallucinating the arguments thinking it's a pretty dress!
It may as well be simulated because you won't see the difference! So now you've imagined some purity test which was never true, so you have nothing and start hallucinating some hyperbolic AI thing
You can’t trust corporations to respect or protect art. You can’t even buy or screen the original theatrical release of Star Wars. The only option is as you say. There are many more examples of the owners of IP altering it in subsequent editions/iterations. This still seems so insane to me that it’s not even for sale anywhere…
I don't understand why you're getting downvoted. So many beautiful things have been lost to perpetual IP, e.g. old games that could be easily ported by volunteers given source code, which can never be monetised again.
Sometimes people create things that surpass them, and I think it is totally fair for them to belong to humanity after the people that created them generated enough money for their efforts.
"toy story film scan" on Kagi led me to a reddit page that may or may not contain links that might help you, but don't dawdle those links may not work forever.
Another one that's been hard to find is the 4k matrix original color grading release. Ping me if you have it! (Not the 1080p release)
I'm surprised they can't just put a filter on the digital versions to achieve a similar look and feel to the 35mm version.
It is clear that the animators factored in the colour changes from the original media to 35mm, so it seems a disservice to them to re-release their works without honouring how they intended the films to be seen.
They could, but it would require some work to get it right. This is very similar to conversations that happen regularly in the retro game scene regarding CRT monitors vs modern monitors for games of a certain era. The analog process was absolutely factored in when the art was being made, so if you want similar visuals on a modern screen you will need some level of thoughtful post-processing, as in the sketch below.
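As a rough illustration of what "thoughtful post-processing" can mean, here is a bare-bones scanline-and-softening pass in numpy; real CRT shaders model far more (phosphor masks, bloom, curvature), and the strengths here are arbitrary:

```python
import numpy as np

def crt_ish(frame: np.ndarray, scanline_strength: float = 0.25) -> np.ndarray:
    """Darken alternate rows and blend horizontally so hard pixel edges
    soften the way they would on a CRT. Purely illustrative."""
    out = frame.copy()
    out[1::2, :, :] *= 1.0 - scanline_strength                      # fake scanlines
    blurred = (out + np.roll(out, 1, axis=1) + np.roll(out, -1, axis=1)) / 3.0
    return np.clip(0.7 * out + 0.3 * blurred, 0.0, 1.0)             # mild softening

if __name__ == "__main__":
    print(crt_ish(np.random.rand(240, 320, 3)).shape)
```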
They could reduce the saturation with 1 mouse click if they wanted, but they didn't. They must have intentionally decided that high saturation is desirable.
I’m reminded of the beginning of the movie Elf, where the book publisher is informed that a printing error means their latest book is missing the final two pages. Should they pulp and reprint? He says,
> You think a kid is going to notice two pages? All they do is look at the pictures.
I’m quite sure bean counters look at Disney kids movies the exact same way, despite them being Disney’s bread and butter.
With Star Wars you have a dedicated adult fan base that’ll buy up remasters and reworkings. Aladdin? Not so much. Especially in the streaming era, no one is even buying any individual movie any more.
I'm a 39 year old man who ground his VHS of Aladdin to dust in the 90s, and bought the Blu Ray because I can't say I can rely on streaming to always exist.
> With Star Wars you have a dedicated adult fan base that’ll buy up remasters and reworkings. Aladdin? Not so much. Especially in the streaming era, no one is even buying any individual movie any more.
I agree it was likely Disney being cheap, but there are tons of people who'll buy up Disney movies on physical media in the age of streaming. Not only are there Disney fans who'd rival the obsessiveness of Star Wars fans, but, like Lucas, Disney just can't leave shit alone. They go back and censor stuff all the time and you can't get the uncensored versions on their streaming platform. Aladdin is even an example where they've made changes. It's not even a new thing for Disney: the lyrics to one of the songs in Aladdin were changed long before Disney+ existed.
The vast majority of people will not care nor even notice. Some people will notice and say, hey, why is it "blurry." So do you spend a good chunk of time and money to make it look accurate or do you just dump the file onto the server and call it a day?
To say nothing of the global audience for these films. I'm guessing most people's first experience seeing these movies was off a VHS or DVD, so the nostalgia factor is only relevant to a small percentage of viewers, and only a small percentage of that percentage notices.
VHS resolution is total crap... yet: it's not uncommon for the colors and contrast on VHS (and some early DVD) to be much better than what is available for streaming today.
This is totally bonkers, because the VHS format is crippled, also color wise. Many modern transfers are just crap.
It’s really amazing how some Blu-rays do in fact manage to be net-worse than early DVD or even VHS, but it’s true.
An infamous case is the Buffy the Vampire Slayer tv show. The Blu-ray (edit: and streaming copies) went back to the film source, which is good, but… that meant losing the color grading and digital effects, because the final show wasn’t printed to film. Not only did they get lazy recreating the effects, they don’t seem to have done scene-by-scene color grading at all. This radically alters the color-mood of many scenes, but worse, it harms the legibility of the show, because lots of scenes were shot day-for-night and fixed in post, but now those just look like they’re daytime, so it’s often hard to tell when a scene is supposed to be taking place, which matters a lot in any show or film but kinda extra-matters in one with fucking vampires.
The result is that even a recorded-from-broadcast VHS is arguably far superior to the blu ray for its colors, which is an astounding level of failure.
(There are other problems with things like some kind of ill-advised auto-cropping seeming to have been applied and turning some wide shots into close-ups, removing context the viewer is intended to have and making scenes confusing, but the colors alone are such a failure that a poor VHS broadcast recording is still arguably better just on those grounds)
How can we get away from this mindset as a society, where craft and art are sacrificed at the altar of "it's not monetarily worth it."
There's a fucking lot of things that are not worth it monetarily, but worth it for the sake of itself. Because it's a nice gesture. Or because it just makes people happy. Not to sound like some hippie idealist, but it's just so frustrating that everything has to be commoditized.
Centuries is stretching it. It’s central to industrialisation, Taylor, Ford, etc. The relentless pursuit of efficiency and technique. Its antithesis is art for art’s sake.
In modern tech circles, the utilitarian mindset is going strong, now that the hacker ethos is dead and it’s all about being corporate friendly and hireable.
Yeah the industrialised world wasn't maligned by Blake as 'dark Satanic mills' or as Mordor by Tolkien because they found it an artistically fulfilling place.
> How can we get away from this mindset as a society, where craft and art are sacrificed at the altar of "it's not monetarily worth it."
Honestly, by weakening copyright protections. People who love the works will do the work to protect them when they don't have to fear being sued into bankruptcy for trying to preserve their own culture.
You can sit down and recolor the movie frame by frame and release it on torrent yourself, it'll make many people happy. It won't be worth it monetarily but since you're annoyed it doesn't exist and money isn't a factor...
It's always easy to complain about others not being generous enough with their time, but we always have an excuse for why we won't do it ourselves.
You can't do that, since besides time you also need knowledge/skill. So the final difference could be between "an extra 1% of the budget" at a corporate level vs "an extra 10% of your life to become a professional and fix a video, and also break the law in the process".
Pretty easy to see how it's not just "an excuse", but a bit more fundamental issue
In this particular instance though it's not really about time, it's studios not wanting to pay what I imagine would be a relatively small amount to do the conversion. It's not going to be a frame-by-frame laborious process.
> You can sit down and recolor the movie frame by frame and release it on torrent yourself, it'll make many people happy.
You can't, at least not if you want an acceptable result.
In photography, if you have a JPEG photo only, you can't do post-facto adjustments of the white balance, for that you need RAW - too much information has been lost during compression.
For movies it's just the same. To achieve something that actually looks good with a LUT (that's the fancy way for re-coloring, aka color grading), you need access to the uncompressed scans, as early in the processing pipeline as you can get (i.e. before any kind of filter is applied).
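A tiny numpy illustration of one part of that: once material has been quantized down (here to 8 bits, standing in for a compressed delivery), a grading curve can only shuffle the levels that are left, whereas the same curve on higher-precision data stays smooth. The curve knots are arbitrary, not a real film-emulation LUT:

```python
import numpy as np

def apply_curve(img: np.ndarray, knots_in, knots_out) -> np.ndarray:
    """Per-channel tone curve via linear interpolation between knot points."""
    return np.interp(img, knots_in, knots_out)

knots_in = [0.0, 0.25, 0.5, 0.75, 1.0]
knots_out = [0.0, 0.18, 0.45, 0.80, 1.0]      # arbitrary contrast-ish curve

gradient = np.linspace(0.0, 1.0, 4096)        # high-precision source ramp
eight_bit = np.round(gradient * 255) / 255    # the same ramp after 8-bit quantization

print(len(np.unique(apply_curve(gradient, knots_in, knots_out))))   # thousands of levels survive
print(len(np.unique(apply_curve(eight_bit, knots_in, knots_out))))  # at most 256 levels
```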
Disney do pay for industry leading colorists. They chose to favour a more saturated look for Aladdin et al. It is reasonable to prefer either.
I can't imagine what happened to the greens in the Toy Story examples if they are accurate.
And ultimately, what you need to achieve acceptable CRT effects is resolution. Only now, with 4K and above, can we start to portray the complex interactions between the electron beam and the produced image by your console. But the colour banding that caused the hearts of The Legend of Zelda to show a golden sheen is still unreachable.
It's not just about slapping on some grain and calling it a day; it's about honoring a whole set of artistic decisions that were made with that specific medium in mind
You can, that's what Pixar did while creating the film. From the article:
> During production, we’re working mostly from computer monitors. We’re rarely seeing the images on film. So, we have five or six extremely high-resolution monitors that have better color and picture quality. We put those in general work areas, so people can go and see how their work looks. Then, when we record, we try to calibrate to the film stock, so the image we have on the monitor looks the same as what we’ll get on film.
But they didn't do a perfect job (the behavior of film is extremely complex), so there's a question: should the digital release reflect their intention as they were targeting these calibrated monitors, or should it reflect what was actually released? Also, this wouldn't include other artifacts like film grain.
> Then, when we record, we try to calibrate to the film stock, so the image we have on the monitor
Except, as they say, the high grade monitors were calibrated to emulate the characteristics of film.
If we can show that D+ doesn't look like the film, then we can point out that it probably doesn't look like the calibrated monitors either. Those army men are not that shade of slime green in real life, and you'll have a hard time convincing me that after all the thought and effort went in to the animation they allowed that putrid pea shade to go through.
The best option for me would be to release it in whatever format preserves the most of the original colour data without modification, then let the viewer application apply colour grading. Give me the raw renders in a linear 16bpc colour space with no changes. Sadly, I don't think we have digital movie formats that can handle that.
It is doable and you can get presets designed to mimic the look of legendary photography film stock like Velvia. But what they did back then was very much an analog process and thus also inherently unstable. Small details start to matter in terms of exposure times, projectors used etc. There’s so many frames and it took so much time, that it’s almost guaranteed there’d be noticeable differences due to process fluctuations.
I've grown very fond of having shaders available for my retro games.
I suspect having shader plugins for TV and movie watching will become a thing.
"The input is supposed to be 24 FPS, so please find those frames from the input signal. Use AI to try to remove compression artifacts. Regrade digital for Kodak 35mm film. Then, flash each frame twice, with blackness in-between to emulate how movie theaters would project each frame twice. Moderate denoise. Add film grain."
I don't actually know what kind of filters I'd want, but I expect some people will have very strong opinions about the best way to watch given movies. I imagine browsing settings, like browsing user-contributed Controller settings on Steam Deck...
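If something like that ever materialises, I'd expect user-shared presets to look like an ordered chain of frame filters, roughly like this toy sketch (the filters here are trivial stand-ins; real regrade/denoise/grain passes would be far more involved, and the preset name is made up):

```python
import numpy as np

def regrade(frame: np.ndarray) -> np.ndarray:
    """Placeholder 'film look' regrade: a tiny gamma tweak."""
    return np.clip(frame ** 1.05, 0.0, 1.0)

def add_grain(frame: np.ndarray, strength: float = 0.02) -> np.ndarray:
    """Placeholder grain: additive Gaussian noise."""
    return np.clip(frame + np.random.normal(0.0, strength, frame.shape), 0.0, 1.0)

PRESET_KODAK_ISH = [regrade, add_grain]   # a hypothetical user-contributed preset

def apply_preset(frame: np.ndarray, preset) -> np.ndarray:
    for step in preset:
        frame = step(frame)
    return frame

if __name__ == "__main__":
    print(apply_preset(np.random.rand(4, 4, 3), PRESET_KODAK_ISH).shape)
```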
Neat! The Youtube channel Noodle recently did a related deep dive into the differences in the releases of The Matrix [0]. The back half of the video also touches on the art of transferring from film/video to digital.
I always felt the old Matrix had a colder blue, and it changed drastically when the second and third hit cinemas. At least that was my memory, because I watched a double feature when the second one hit the theatres and complained then that The Matrix somehow looked weird.
But it could also be my memory, since I also own the Blu-ray release.
Another movie with the same/similar problem is the Lord of the Rings Extended Editions: compared to the DVD release, both the Blu-ray and 4K versions look different. As far as I remember, they fixed it for the theatrical version in 4K but not the extended.
It's wild to realize that the version of the movie most of us remember emotionally is not the one that's currently streaming. There's something bittersweet about that... like your memory has a certain warmth and texture that modern restorations just can't quite recreate.
Movies projected on film look different not only because of the color and texture, but also a constant spatial jitter over time. When the film moves through the projector, each frame locks into a slightly different position vertically. That creates a wobble that's called "film weave."
(If you want to create truly authentic-looking titles for a 1980s B-grade sci-fi movie, don't forget to add that vertical wobble to your Eurostile Extended Bold layout that reads: "THE YEAR IS 2025...")
The one frame they showed from the Lion King really stood out. The difference in how the background animals were washed out by the sunlight makes the film version look significantly better.
I'm not sure if I'm just young enough to be on the other side of this despite seeing all three of those Disney movies as a millennial kid (Lion King and Aladdin were VHS mainstays in my house, and I remember seeing Mulan in theaters), but I honestly don't find the film grain to look better at all and think all three of those bottom images are much more appealing. For the Toy Story ones, I think I'm mostly indifferent; I can see why some people might prefer the upper film images but don't really think I'd notice which one I was watching. I'd definitely think I'd notice the difference in the 2D animation though and would find the film grain extremely distracting.
To me it's much worse. You can't see all of the detail the artists drew, and there is noise everywhere, even specks of dust and scratches. Whenever I watch a film-based movie my immersion always gets broken by all the little specks that show up. Digital is a much more immersive experience for me.
> To me it's much worse. You can't see all of the detail the artists drew, and there is noise everywhere, even specks of dust and scratches.
In the lion king example you weren't meant to see all of the detail the artists drew. In the Army men example the color on the digital version is nothing like the color of the actual toys.
They originally made those movies the way they did intentionally because what they wanted wasn't crystal clear images with unrealistic colors, they wanted atmosphere and for things to look realistic.
Film grain and dust can be excessive and distracting. It's a good thing when artifacts added due to dirt/age gets cleaned up for transfers so we can have clear images, but the result of that clean up should still show what the artists originally intended and that's where disney's digital versions really miss the mark.
This is an interesting take when you look at the gas station Toy Story example and consider the night sky. In the digital version the stars are very washed out but in the film version the sky is dark and it's easy to appreciate the stars. Perhaps it's unrealistic when you realize the setting is beneath a gas station canopy with fluorescent lights, but that detail, along with some of the very distinct coloring, stuck out to me.
Which is of course highly subjective; you could argue that film grain is an unwanted but unavoidable side-effect from the medium used, just like other artifacts from film - vertical alignment issues, colour shifting from "film breath", 24 frames per second, or the weird sped-up look from really old films.
I don't believe these were part of the filmmaker's vision at the time, but unavoidable. Nowadays they are added again to films (and video games) on purpose to create a certain (nostalgic) effect.
If you plug a Nintendo system's RCA cables into a modern TV, it will look like garbage. Emulated games on LCDs look pixelated.
Those games were designed for a CRT's pixel grid. They don't look right on LCDs, and the upscalers in home theater equipment don't respect that. There are hardware upscalers and software shaders that are specifically designed to replicate a CRT's quirks, to let you better approximate how those games were designed to be played.
Related - someone recently built a CRT dock for his Switch, so he could play Nintendo Switch Online's emulated games as originally intended:
I'm stunned so many people here can remember details as fine as the colour grading of a film. I couldn't remember specifics like that from 6 months ago, let alone 30 years ago when I was a child and wouldn't have had the thought to watch for cinematographic touches.
Side note - I wonder if it's a millennial thing that our memories are worse due to modern technology, or perhaps we are more aware of false memories due to the sheer availability of information like this blog post.
At least for me it's not so much details like color grading over the entire film, it's more like a specific scene got burned into memory. Movie looks pretty much fine until reaching that scene and it's so different it's like something got shaken loose and you start to see the larger issues.
For an example people here might be more familiar with, it's like how you can't even see bad kerning until you learn what it is, then start to notice it a lot more.
I am not a huge gamer - maybe a dozen hours a year. But I feel that, say, Mario responds differently to controls in an emulator than how I remember Mario responding on an NES with a CRT.
But I was never very good, and it has been decades, so I don't know how much of this is just poor memory - I actually don't think I'm good enough/play enough that the latency of modern input/displays makes a difference at my level.
I would love to try both side-by-side to see if I could pick out the difference in latency/responsiveness.
I doubt many people 'remember' this to any significant extent, but there are probably many cases of media giving the 'wrong' vibe with a new release, and you just assume it's because you've gotten older, but then when you get access to the original you experienced, the 'good' vibe is back, and you can easily compare between the two.
Although some people do in fact remember the differences, I'd guess a lot of those incidents are caused by people experiencing them in fairly quick succession. It's one thing to remember the difference between a DVD from 20 years ago and a Blu-ray you only watched today, and another to compare a DVD watched 15 years ago with a Blu-ray watched 14 years ago.
Different people just remember different things. I bet most people don't remember either, and only go "ah yes, of course!" after reading this blog post (which means they didn't remember at all).
Anecdata here, but I played Zelda: Ocarina of Time on a CRT when I was a child, and have since replayed it many times via emulator. The game never looked quite as good as I remembered it, but of course I chalked it up to the fact that graphics have improved by several orders of magnitude since then.
Then a few years ago I was throwing out my parents' old CRT and decided to plug in the N64 one last time. Holy crap, was it like night and day. It looked exactly as I remembered it, so much more mysterious and properly blended than it does on an LCD screen.
I don't see why the same wouldn't apply to films, sometimes our memories aren't false.
> He [David DiFrancesco] broke ground in film printing — specifically, in putting digital images on analog film.
> Their system was fairly straightforward. Every frame of Toy Story’s negative was exposed, three times, in front of a CRT screen that displayed the movie.
While I have no doubt that this hadn't been done at the scale and resolution, it struck me that I'd heard about this concept in a podcast episode [1] in which very early (1964) computer animation was discussed alongside the SC4020 microfilm printer that used a Charactron CRT which could display text for exposure to film or plot lines.
Generally yes, but we're still working on it all these years later! This article by Chris Brejon offers a very in-depth look into the differences brought about by different display transforms: https://chrisbrejon.com/articles/ocio-display-transforms-and...
The "best" right now, in my opinion, is AgX, which at this point has various "flavours" that operate slightly differently. You can find a nice comparison of OCIO configs here: https://liamcollod.xyz/picture-lab-lxm/CAlc-D8T-dragon
Playing them, handling them, and poor storage all degrade them. Most original prints will have been played many times, and often haven’t been consistently stored well.
The 4K77 etc. fan scans of the original Star Wars trilogy, which aimed to get as close as possible to what one would have seen in a theater the year of release, used multiple prints to fill in e.g. bad frames, used references like (I think) magazine prints of stills and well-preserved fragments or individual frames to fix the (always faded, sometimes badly) color grading and contrast, and had to extensively hand-correct things like scratches, with some reels or parts of reels requiring a lot more of that kind of work than others. Even Jedi required a lot of that sort of work, and those reels would have been only something like 30-35 years old when they started working on them.
Hollywood does store original prints in underground salt mines (at least I am aware of a place in Kansas where they do this). Of course who knows where the frames we are being shown from the 35mm film version are coming from. Likely not these copies that are probably still in halite storage.
So it's fascinating reading this looking at the screengrabs of the "original" versions... not so much because they are "how I remember them" but indeed, because they have a certain nostalgic quality I can't quite name - they "look old". Presumably this is because, back in the day, when I was watching these films on VHS tapes, they had come to tape from 35mm film. I fear I will never again be able to look at "old looking" footage with the same nostalgia again, now that I understand why it looks that way - and, indeed, that it isn't supposed to look that way!
Beauty and the Beast on Bluray looks completely different from what I remember; I had assumed that they had just regraded it, but given that it was developed with CAPS, maybe this is part of the effect?
> "Even so, it’s a little disquieting to think that Toy Story, the film that built our current world, is barely available in the form that wowed audiences of the ‘90s."
Load it up in DaVinci Resolve, knock the saturation and green curve down a bit, and boom, it looks like the film print.
Or you could slap a film-look LUT on, but you don't need to go that far.
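Something in the spirit of that tweak, sketched with plain numpy rather than Resolve; the amounts are eyeball guesses to be tuned against the film scan, not values taken from any actual print:

```python
import numpy as np

def toward_film_print(rgb: np.ndarray) -> np.ndarray:
    """Mild global desaturation plus a pull-down on the green channel."""
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
    out = rgb * 0.85 + luma[..., None] * 0.15   # knock the saturation down a bit
    out[..., 1] *= 0.92                         # knock the green curve down a bit
    return np.clip(out, 0.0, 1.0)

if __name__ == "__main__":
    print(toward_film_print(np.random.rand(4, 4, 3)).max())
```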
What an excellent piece! I thoroughly enjoyed reading it, brought my childhood memories flooding back. I have so many fond recollections of that 90s era, including "A Bug's Life." I remember gathering with my cousins at my grandmother's house to watch these films on VHS. Time flies.
It reminds me also of the 24 FPS discussion; 24 is still the standard for cinema as far as I know, even though 48 or 60 FPS are pretty standard for series. The 24 FPS gives it a more cinematic feeling.
To add, when it comes to video games sometimes people go "baww but 24 fps is enough for film". However, pause a film and you'll see a lot of smearing, not unlike cartoon animation frames I suppose, but in a video game every frame is discrete so low framerate becomes visually a lot more apparent.
I think it was The Hobbit that had a 48 fps version, and people just... weren't having it. It's technologically superior I'm sure (as higher frame rates are), but it just becomes too "real" then. IIRC they also had to really update their make-up game, because at higher frame rates and/or resolutions people can see everything.
Mind you, watching older TV shows nowadays is interesting; I think they were able to scan the original film for e.g. the X Files and make a HD or 4K version of it, and unlike back in the day, nowadays you can make out all the fine details of the actor's skin and the like. Part high definition, part watching it on a 4K screen instead of a CRT TV.
It’s fascinating to me how many of these discussions boil down to dialing in dynamic range for the medium in question.
As the Aladdin still shows with its wildly altered colors clearly other aspects matter/are at play. But the analog/digital discussions always seem, at least to me, to hinge heavily on DR. It’s just so interesting to me.
Many of us remember the leap from SD->HD. Many of us also can point out how 4K is nice and even noticeably better than FHD, but man…getting a 4K OLED TV with (and this is the important part) nice DR was borderline another SD->HD jump to me. Especially with video games and older films shot and displayed on film stock from start to finish. The difference is incredibly striking.
If you're interested in these 35mm film scans, I recommend watching this excellent YouTube video "SE7EN & How 35mm Scans Lie to You" https://www.youtube.com/watch?v=uQwQRFLFDd8 for some more background on how this works, and especially how these comparisons can sometimes be misleading and prey on your nostalgia a bit.
If you're interested in making digital footage look exactly like film in every possible way, I'll shill our product Filmbox: https://videovillage.com/filmbox/
Now there is the problem where many of my friends will take one look at a movie I started on the TV and say "ew, I don't like this movie, it's old." They don't realize the reason they feel that way, viscerally, is because it's shot on film. How do I get people to watch film movies with me? They are far better anyway on average than many modern movies (from a storytelling, moviemaking POV, to say nothing of the picture quality).
Make them into a drinking game. We watched The Princess Bride the other day (never watched it), I think it's aged pretty well but then I'm old. But if they get bored, make it a game to have a drink or get a point or whatever for every sexual innuendo, lol.
Some films didn't age well though.
And finally, accept it and move on, ultimately it's their loss.
>Computer chips were not fast enough, nor disks large enough, nor compression sophisticated enough to display even 30 minutes of standard-definition motion pictures.
This is not true at all. Being compatible with outdated, film based projectors was much more important for being able to show it in as many theaters as possible. If they wanted to do a digital screening it would have been technologically possible.
I bumped on this too, since 1994-1995 was about the time when multi-gigabyte hard drives were readily available and multiple full motion video codecs were being used in games, albeit for cut scenes. Theater projector compatibility makes complete sense.
In 1994-1995, all the pieces for digital cinema were there, but they weren't integrated, and there were no installed projectors. The Phantom Menace was shown digitally... on two screens. By the end of 2000, there were 31 digital cinema screens in theaters.
Digital cinema went with Motion JPEG2000 with high quality settings, which leads to very large files, but also much better fidelity than likely with a contemporary video codec.
> In 1994-1995, all the pieces for digital cinema were there, but they weren't integrated, and there were no installed projectors.
I agree with that. The article's quote from Pixar's "Making The Cut at Pixar" book was that the technology wasn't there (computer chips fast enough, storage media large enough, compression sophisticated enough) and I--along with the comment I replied to--disagree with that conclusion.
Back in that period I was somewhat in charge of the render queue at a small animation company. I had to get rendered images onto tape, as in Sony Digibeta or better. Before that I had to use film.
We had an incredible amount of fancy toys with no expense spared, including those SGI Onyx Infinite Reality boxes with the specialist video breakout boards that did digital video or analogue with genlock. Disks were 2GB SCSI and you needed a stack of them in RAID formations to play video. This wasn't even HD, it was 720 x 576 interlaced PAL.
We also had to work within a larger post production process, which was aggressively analogue at the time with engineers and others allergic to digital. This meant tapes.
Note that a lot of this was bad for tape machines. These cost £40k upwards, and advancing the tape by one frame to record it, then back again to reposition the tape for the next frame, for hours on end, was a sure way to wreck a tape machine, so we just hired them.
Regarding 35mm film, I also babysat the telecine machines where the film bounces up and down on the sprockets, so the picture is never entirely stable. These practical realities of film just had to be worked with.
The other fun aspect was moving the product around. This meant hopping on a train, plane or bicycle to get tapes to where they needed to be. There was none of this uploading malarkey although you could book satellite time and beam your video across continents that way, which happened.
Elsewhere in broadcasting, there was some progress with glorified digital video recorders. These were used in the gallery and contained the programming that was coming up soon. These things had quite a lot of compression and their own babysitting demands. Windows NT was typically part of the problem.
It was an extremely exciting time to be working in tech but we were a long way off being able to stream anything like cinema resolution at the time, even with the most expensive tech of the era.
Pixar and a few other studios had money and bodies to throw at problems, however, there were definitely constraints at the time. The technical constraints are easy to understand but the cultural constraints, such as engineers allergic to anything digital, are hard to imagine today.
Those comparisons were strangely jarring. It's odd to see (on the internet awash with "Mandela Effect" joke conspiracies) direct photo/video evidence that things we remember from our childhood have indeed been changed; sometimes for the worse!
I just showed Toy Story to my kids. It looked really bad. Mostly textures and lighting.
I wonder if artificial grain would actually make it look better.
Like when the game Splinter Cell was released, there were two additional ‘views’ simulating infrared and thermal cameras. Those had heavy noise added to them and felt so real compared to the main view.
I find a lot of the stuff I remember from decades ago looks worse now. Toy Story in particular I watched when I got a projector after I'd seen Toy Story 4 and it looked bad, almost to the point I wish I hadn't tarnished my memory of it. Similar things have happened with N64 games that I cherished when I was little.
I don't buy that it's a real degradation due to different presentation methods. I'm sorry, but no matter what film stock you lovingly transfer Toy Story to, it's never going to look like it does in your memory. Same with CRTs. Sure, it's a different look, but my memory still looks better.
It's like our memories get automatically upgraded when we see newer stuff. It's jarring to go back and realise it didn't actually look like that in the 90s. I think this is just the unfortunate truth of CGI. So far it hasn't reached the point of producing something timeless. I can watch a real film from the 80s and it will look just as "good" as one from today. Of course the colours will be different depending on the transfer, but what are we hoping for? To get the exact colours the director saw in his mind's eye? That kind of thing has never really interested me.
> Same with CRTs. Sure, it's a different look, but my memory still looks better.
I don’t have this issue and never have. For whatever reason I’ve never “upgraded” them in my mind, and they look today exactly as I remember them when played on period hardware.
The changes in the Aladdin and Lion King stills surely can’t be accidental side effects? The Aladdin shot especially looks like they deliberately shifted it to a different time of day. Could there have been a continuity reason?
The Simpsons was originally made in 4:3. Many people don't like watching with large black bars to the left and right, so they show a cropped 16:9 version. People complained because the cropping occasionally ruins a visual joke, so I believe you can now opt into either.
A similar crime against taste as the pan-and-scan "fullscreen" DVDs of the early 2000s. If I want to pay to watch something, don't crop out a chunk of what the cinematographer wanted me to see...
It seems like the video examples are unfortunately now unavailable, but the discussion is still interesting and it's neat to see the creative trade-offs and constraints in the process. I think those nuances help evoke generosity in how one approaches re-releases or other versions or cuts of a piece of media.
Pan and scan wasn't a DVD innovation. Most VHS releases were pan and scan too; DVDs at least commonly had widescreen available (many early discs came with widescreen on one side and full screen on the other... good luck guessing whether "widescreen" on the hub indicates that the side you're reading is widescreen, or that the other side is widescreen so you should have the widescreen label facing up in your player).
Wow. Based on those comparisons they really do feel completely different. Really remarkable how such relatively simple changes in lighting and whatnot can drastically change the mood.
And here I was thinking of re-watching some old Disney/Pixar movies soon :(
TL;DR: Linking to YouTube trailer scans as comparisons for colour is misleading and not accurate.
---
> see the 35 mm trailer for reference
The article leans heavily on scans of trailers to show what colours, grain, sharpness, etc. looked like. This is quite problematic, because you are relying on a scan done by someone on the Internet to accurately depict what something looked like in a commercial cinema. Now, I am not a colour scientist (far from it!), but I am a motion picture film hobbyist and so can speak a bit about some of the potential issues.
When projected in a movie theatre, light is generated by a short-arc xenon lamp. This has a very particular output light spectrum, and the entire movie process is calibrated and designed to work with this. The reflectors (mirrors) in the lamphouse are tuned to it, the films are colour graded for it, and then the film recorders (cameras) are calibrated knowing that this will be how it is shown.
When a film is scanned, it is not lit by a xenon short-arc lamp; various other illumination methods are used depending on the scanner. CRTs and LEDs are common. Commercial scanners are, on the whole, designed to scan negative film. It's where the money is - and so they are set up to work with that, which is very different to positive movie release film stock. Scanners therefore have different profiles to try and capture the different film stocks, but in general, today's workflow involves scanning something in and then colour correcting post-scan to meet an artist's expectations/desires.
Scanning and accurately capturing what is on a piece of film is something that is really quite challenging, and not something that any commercial scanner today does, or claims to do.
The YouTube channels referenced are FT Depot, and 35mm Movie Trailers Scans. FT Depot uses a Lasergraphics 6.5K HDR scanner, which is a quite high end one today. It does have profiles for individual film stocks, so you can set that and then get a good scan, but even the sales brochure of it says:
> Many common negative film types are carefully characterized at Lasergraphics to allow our scanning software to compensate for variation. The result is more accurate color reproduction and less time spent color grading.
Note that it says that less time is spent colour grading - it is still not expected that it will accurately capture exactly what was on the film. It also specifies negative film; I don't know whether it has positive stock profiles, as I am not lucky enough to have worked with one - for this, I will assume it does.
The "scanner" used by 35mm Movie Trailers Scans is a DIY, homemade film scanner that (I think, at least the last time I spoke to them) uses an IMX-183 sensor. They have both a colour sensor and a monochrome sensor, I am not sure what was used to capture the scans linked in the video. Regardless of what was used, in such a scanner that doesn't have the benefit of film stock profiles, etc. there is no way to create a scan that accurately captures what was on the film, without some serious calibration and processing which isn't being done here. At best, you can make a scan, and then manually adjust it by eye afterwards to what you think looks good, or what you think the film looks like, but without doing this on a colour calibrated display with the original projected side-by-side for reference, this is not going to be that close to what it actually looked like.
Now, I don't want to come off as bashing a DIY scanner - I have made one too, and they are great! I love seeing the scans from them, especially old adverts, logos, snipes, etc. that aren't available anywhere else. But, it is not controversial at all to say that this is not colour calibrated in any way, and in no way reflects what one actually saw in a cinema when that trailer was projected.
All this is to say that statements like the following in the article are pretty misleading - as the differences may not be attributable to the direct-digital-release process at all, and could just be that a camera white balance was set wrong, or some post processing to what "looked good" came out different to the original:
> At times, especially in the colors, they’re almost unrecognizable
> Compared to the theatrical release, the look had changed. It was sharp and grainless, and the colors were kind of different
I don't disagree with the premise of the article - recording an image to film, and then scanning it in for a release _will_ result in a different look to doing a direct-digital workflow. That's why major Hollywood films spend money recording and scanning film to get the "film look" (although that's another can of worms!). It's just not an accurate comparison to put two images side by side, when one is of a trailer scan of unknown accuracy.
Damn. I wish we could get a release with the 35mm colors the way they look in the comparisons. The Aladdin one specifically looks so good! It makes me feel like we're missing out on so much from the era it was released.
This topic is fascinating to me. The Toy Story film workflow is a perfect illustration of intentional compensation: artists pushed greens in the digital master because 35 mm film would darken and desaturate them. The aim was never neon greens on screen, it was colour calibration for a later step. Only later, when digital masters were reused without the film stage, did those compensating choices start to look like creative ones.
I run into this same failure mode often. We introduce purposeful scaffolding in the workflow that isn’t meant to stand alone, but exists solely to ensure the final output behaves as intended. Months later, someone is pitching how we should “lean into the bold saturated greens,” not realising the topic only exists because we specifically wanted neutral greens in the final output. The scaffold becomes the building.
In our work this kind of nuance isn’t optional, it is the project. If we lose track of which decisions are compensations and which are targets, outcomes drift badly and quietly, and everything built after is optimised for the wrong goal.
I’d genuinely value advice on preventing this. Is there a good name or framework for this pattern? Something concise that distinguishes a process artefact from product intent, and helps teams course-correct early without sounding like a semantics debate?
There’s an analog analogue: mixing and mastering audio recordings for the devices of the era.
I first heard about this when reading an article or book about Jimi Hendrix making choices based on what the output sounded like on AM radio. Contrast that with the contemporary recordings of The Beatles, in which George Martin was oriented toward what sounded best in the studio and home hi-fi (which was pretty amazing if you could afford decent German and Japanese components).
Even today, after digital transfers and remasters and high end speakers and headphones, Hendrix’s late 60s studio recordings don’t hold a candle anything the Beatles did from Revolver on.
> There’s an analog analogue: mixing and mastering audio recordings for the devices of the era.
In the modern day, this has one extremely noticeable effect: audio releases used to assume that you were going to play your music on a big, expensive stereo system, and they tried to create the illusion of the different members of the band standing in different places.
But today you listen to music on headphones, and it's very weird to have, for example, the bassline playing in one ear while the rest of the music plays in your other ear.
I know you're looking for something more universal, but in modern video workflows you'd apply a chain of color transformations on top the final composited image to compensate the display you're working with.
So I guess try separating your compensations from the original work and create a workflow that automatically applies them
Theory: Everything is built on barely functioning ruins with each successive generation or layer mostly unaware of the proper ways to use anything produced previously. Ten steps forward and nine steps back. All progress has always been like this.
(Cough) Abstraction and separation of concerns.
In Toy Story's case, the digital master should have had "correct" colors, and the tweaking done in the transfer to film step. It's the responsibility of the transfer process to make sure that the colors are right.
Now, counter arguments could be that the animators needed to work with awareness of how film changes things; or that animators (in the hand-painted era) always had to adjust colors slightly.
---
I think the real issue is that Disney should know enough to tweak the colors of the digital releases to match what the artists intended.
Do you have some concrete or specific examples of intentional compensation or purposeful scaffolding in mind (outside the topic of the article)?
Not scaffolding in the same way, but, two examples of "fetishizing accidental properties of physical artworks that the original artists might have considered undesirable degradations" are
- the fashion for unpainted marble statues and architecture
- the aesthetic of running film slightly too fast in the projector (or slightly too slow in the camera) for an old-timey effect
Great examples. My mind jumps straight to audio:
- the pops and hiss of analog vinyl records, deliberately added by digital hip-hop artists
- electric guitar distortion pedals designed to mimic the sound of overheated tube amps or speaker cones torn from being blown out
Motion blur. 24fps. Grain. Practically everything we call cinematic
Isn't the entire point of "reinventing the wheel" to address this exact problem?
This is one of the tradeoffs of maintaining backwards compatibility and stewardship -- you are required to keep track of each "cause" of that backwards compatibility. And since the number of "causes" can quickly become enumerable, that's usually what prompts people to reinvent the wheel.
And when I say reinvent the wheel, I am NOT describing what is effectively a software port. I am talking about going back to ground zero, and building the framework from the ground up, considering ONLY the needs of the task at hand. It's the most effective way to prune these needless requirements.
enumerable -> innumerable
(opposite meaning)
> (opposite meaning)
Funnily enough, e- means "out" (more fundamentally "from") and in- means "in(to)", so that's not an unexpected way to form opposite words.
But in this case, innumerable begins with a different in- meaning "not". (Compare inhabit or immiserate, though.)
It seems pretty common in software - engineers not following the spec. Another thing that happens is the pivot. You realize the scaffolding is what everyone wants and sell that instead. The scaffold becomes the building and also product.
That’s a great observation. I’m hitting the same thing… yesterday’s hacks are today’s gospel.
My solution is decision documents. I write down the business problem, background on how we got here, my recommended solution, alternative solutions with discussion about their relative strengths and weaknesses, and finally an executive summary that states the whole affirmative recommendation in half a page.
Then I send that doc to the business owners to review and critique. I meet with them and chase down ground truth. Yes it works like this NOW but what SHOULD it be?
We iterate until everyone is excited about the revision, then we implement.
There are two observations I've seen in practice with decision documents: the first is that people want to consume the bare minimum before getting started, so such docs have to be very carefully written to surface the most important decision(s) early, or otherwise call them out for quick access. This often gets lost as word count grows and becomes a metric.
The second is that excitement typically falls with each iteration, even while everyone agrees that each is better than the previous. Excitement follows more strongly from newness than rightness.
Eventually you'll run into a decision that was made for one set of reasons but succeeded for completely different reasons. A decision document can't help there; it can only tell you why the decision was made.
That is the nature of evolutionary processes and it's the reason people (and animals; you can find plenty of work on e.g. "superstition in chickens") are reluctant to change working systems.
Chesterton’s Fence is a related notion.
"Cargo cult"? As in, "Looks like the genius artists at Pixar made everything extra green, so let's continue doing this, since it's surely genius."
There's a similar issue with retro video games and emulators: the screens on the original devices often had low color saturation, so the RGB data in those games were very saturated to compensate. Then people took the ROMs to use in emulators with modern screens, and the colors are over-saturated or just off. That's why you often see screenshots of retro games with ridiculously bright colors. Thankfully now many emulators implement filters to reproduce colors closer to the original look.
Some examples:
https://www.reddit.com/r/Gameboy/comments/bvqaec/why_and_how...
https://www.youtube.com/watch?v=yA-aQMUXKPM
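To make that compensation concrete, here's a rough sketch of the kind of correction filter an emulator might apply before display. It's Python/numpy; the channel-mixing numbers are made up for illustration (not taken from any real emulator or measured screen), and a real filter would model the original LCD's gamma and response far more carefully.

    import numpy as np

    # Illustrative channel-mixing matrix: bleed a little of each channel into the
    # others and dim the result, roughly imitating a washed-out handheld LCD.
    # These coefficients are invented for this sketch, not measured values.
    MIX = np.array([
        [0.82, 0.12, 0.06],   # output R from input R, G, B
        [0.10, 0.80, 0.10],   # output G
        [0.06, 0.14, 0.80],   # output B
    ])

    def soften_palette(frame_rgb: np.ndarray) -> np.ndarray:
        """Tone down an over-saturated (H, W, 3) uint8 frame toward an 'original screen' look."""
        linear = (frame_rgb.astype(np.float32) / 255.0) ** 2.2   # rough gamma-to-linear
        mixed = linear @ MIX.T                                    # per-pixel channel mixing
        out = np.clip(mixed, 0.0, 1.0) ** (1.0 / 2.2)             # back to display gamma
        return (out * 255.0 + 0.5).astype(np.uint8)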
With the GBA, the original GBA screen and the first-gen GBA SP had very washed-out colors, not saturated at all. The Mario ports to the GBA looked doubly washed out, since they desaturated their colors and were then shown on a desaturated screen. I've heard that the real reason the colors were desaturated was that the first GBA model didn't have a backlight, so the colors were lightened to be more visible, but I'm not quite sure that's the case. Lots of other games didn't do that.
And with the second version of the GBA SP and the GB Micro, colors were very saturated. Particularly on the SP. If anything, cranking up the saturation on an emulator would get you closer to how things looked on those models, while heavily desaturating would get you closer to the look on earlier models.
Ah yes, we often get folks in the nesdev community bickering over which "NES Palette" (sourced from their favorite emulator) is the "best" one. The reality is extraordinarily complicated and I'm barely qualified to explain it:
https://www.nesdev.org/wiki/PPU_palettes#2C02
In addition to CRTs having variable properties, it turns out a lot of consoles (understandably!) cheat a little bit when generating a composite signal. The PPU's voltages are slightly out of spec, its timing is weird to work around a color artifact issue, and it generates a square wave for the chroma carrier rather than an ideal sine wave, which produces even more fun problems near the edges. So we've got all of that going on, and then the varying properties of how each TV chooses to interpret the signal. Then we throw electrons at phosphors and the pesky real world and human perception gets involved... it's a real mess!
https://www.youtube.com/watch?v=2sxKJeYSBmI
This video is related to that issue
Final Fantasy Tactics on Game Boy Advance had a color mode for television.
https://www.youtube.com/shorts/F29nlIz_tWo
Nice, and two LCD modes to adapt to different GBA screens! (presumably the GBA and GBA SP first model, vs the GBA SP second model with backlight)
Aha! I used to work in film and was very close to the film scanning system.
When you scan in a film you need to dust bust it and generally clean it up (there are physical scars on the film from going through the projector; there's also a shit ton of dust that needs to be physically or digitally removed, i.e. "busted").
Ideally you'd use a non-real time scanner like this: https://www.filmlight.ltd.uk/products/northlight/overview_nl... which will collect both colour and infrared. This can help automate dust and scratch removal.
If you're unlucky you'll use a telecine machine, https://www.ebay.co.uk/itm/283479247780 which runs much faster, but has less time to dustbust and properly register the film (so it'll warp more).
However! That doesn't affect the colour. Those colour changes are deliberate and are a result of grading, i.e. a colourist has gone through and made changes to make each scene feel more effective. Ideally they'd alter the colour for emotion, but that depends on who's making the decision.
the mechanics are written out here: https://www.secretbatcave.co.uk/film/digital-intermediary/
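On the infrared point above: the trick is that dust and scratches block IR while the film's dyes mostly pass it, so the IR channel doubles as a defect mask you can inpaint from. Here's a minimal sketch with OpenCV, assuming you already have the colour scan and its IR pass as aligned images (the file names and threshold are placeholders to be tuned per scanner and stock):

    import cv2
    import numpy as np

    # Aligned colour scan and infrared pass (placeholder file names).
    rgb = cv2.imread("frame_0001_rgb.png")                        # (H, W, 3) BGR
    ir = cv2.imread("frame_0001_ir.png", cv2.IMREAD_GRAYSCALE)    # (H, W)

    # Dust and scratches show up dark in the IR pass; threshold them into a mask.
    _, defects = cv2.threshold(ir, 60, 255, cv2.THRESH_BINARY_INV)
    defects = cv2.dilate(defects, np.ones((3, 3), np.uint8))      # cover defect edges

    # Fill the masked areas from their surroundings ("digital dust busting").
    clean = cv2.inpaint(rgb, defects, 3, cv2.INPAINT_TELEA)
    cv2.imwrite("frame_0001_clean.png", clean)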
> However! That doesn't affect the colour.
That has been something I've wondered about since seeing frame comparisons of (probably) telecine'ed prints of The Matrix vs. the myriad home video releases.
I'm a colorist and it absolutely does affect color. Every telecine is different and will create a different-looking scan. Telecine operators will do a one-light pass to try to compensate, but any scan needs to be adjusted to achieve the artist's original vision.
How much of the colour change is dependent on the film printer, and how much on the film scanner/telecine?
It just seems like there’s a lot of variability in each step, so you end up with an unintended colour that will then be taken as the artist’s intent.
How did you dust bust it? Wipe it by hand with a microfiber cloth or something?
In optics & film, blowing air is usually employed, as wiping runs the risk of further scratches if a particle is abrasive (e.g. sand).
There are handheld tools (google hand blower bulb), but I would imagine film scanning uses something less manual
> There's also a shit ton of dust that needs to be physically or digitally removed, i.e. "busted"
Is that because you're just leaving the film out in a big pile, or because it decays rapidly?
I would have expected film to be stored in containers.
Someone correct me if I'm wrong, but I believe it builds a static charge as it runs through the projector and attracts dust. I say this because I remember holding my hand near moving film in our (home) movie projector, and as a kid enjoying feeling the hairs on my arm standing up from the static. Maybe professional gear protects against that somehow, but if not that'd be why.
It's a surprisingly common error where someone picks up an old 35mm print and assumes it is somehow canonical... Besides whatever the provenance of these prints is (this gets complicated), the reality is that these were also made to look as good as they could on typical movie theater projector systems in the 90s. These bulbs were hot and bright and there were many other considerations around what the final picture would look like on the screen. So yeah, if you digitize 35mm film today, it will look different, and different from how it's ever been displayed in a movie theater.
Agreed. It's a fine article but leaves half the story on the table. It is supposedly comparing what these movies looked like in the theater to the modern streaming and Blu-ray versions, but is actually comparing what a film scan (scanner settings unspecified) displayed on a TV (or other unspecified screen) looks like next to the digital versions on (presumably) the same screen. And then we can ask: how were the comparison images captured and rendered to JPEG for the web before we readers view them on our own screens? I'm not arguing Catmull and company didn't do a great job of rendering to film, but this comparison doesn't necessarily tell us anything.
Don't believe me? Download the comparison pictures in the article to your device and play with filters and settings. You can get almost anything you want and the same was true at every step in the render pipeline to your TV.
PS - and don't get me started on how my 60-year-old eyes see color compared to what they perceived when I saw this in the theater.
It’s an interesting and valid point that the projectors of the time mean current scans of 35mm will be different too. However, taking the Aladdin screenshot in particular, the sky is COMPLETELY the wrong colour in the modern digital edition, so it seems to me that these 35mm scans, whilst not a perfect match for the 90s, are closer to correct than their digital counterparts.
And as someone who is part of those conservation communities that scan 35mm with donations to keep the existing look, a lot of the people doing those projects are aware of this. They do some color adjustment to compensate for print fading, for the type of bulbs that were used in movie theatres back then (using a LUT), etc...
I do find that often enough commercial releases like Aladdin or other movies like Terminator 2 are done lazily and have completely different colors than what was historically shown. I think part of this is the fact that studios don't necessarily recognise the importance of that legacy and don't want to spend money on it.
What's wrong with Terminator 2?
Are there like multiple digital releases, one with better colour than the other?
There's a 4K version out that does interesting things with colour grading, here's a post I found: https://www.reddit.com/r/Terminator/comments/d65pbi/terminat.... The one on the left is the remaster.
There was similar outrage (if that's the right word) about a Matrix remaster that either added or removed a green color filter, and there's several other examples where they did a Thing with colour grading / filtering in a remaster.
To me, that just looks like what happens when I try to play HDR content on a system that doesn't know about HDR. (It looks like you're watching it through sunglasses.)
There are multiple versions of The Matrix on the trackers and the internet that I know of. The official releases all look kinda different from each other:
https://www.youtube.com/watch?v=1mhZ-13HqLQ
There's a 35mm scan floating around from a faded copy with really weird colors sometimes
https://www.youtube.com/watch?v=Ow1KDYc9XsE
And there's an Open Matte Version, which I don't know the Origin of.
https://www.youtube.com/watch?v=Z2eCmhBgsyI
For me, it's the Open Matte that I consider the ultimate best version.
See my top level comment for more info on this, but the Aladdin scan used in the article was from a 35mm trailer that's been scanned on an unknown scanner, and had unknown processing applied to it. It's not really possible to compare anything other than resolution and artefacts in the two images.
At the same time, I think the nostalgia people feel for those versions isn't necessarily about accuracy, it's about emotional fidelity
And it was made by a lab that made choices on processing and developing times, which can quite easily affect the resulting image. You hope that labs are reasonably standard across the board and calibrate frequently, but even processing two copies of the same material in a lab, one after the other, will result in images that look different if projected side by side. This is why it's probably impossible to make new prints of 3-strip Cinerama films now; the knowledge, and the number of labs that can do this, are near zero.
This reminds me of how pre-LCD console games don't look as intended on modern displays, or how vinyl sounds different from CDs because mixing and mastering targeted physical media with limitations.
Wasn't CD more a case of cheaping out? Doing the mastering work once, mostly for radio, where the assumed listening scenario was the car or background music, and thus less dynamic range allowed it to be louder on average.
The CD itself can reproduce the same dynamic range and more, but that doesn't sell extra copies.
The loudness war was a thing in all media. In the 80s most of us didn't have CD players but our vinyl and tapes of pop and rock were all recorded overly loud. Compared to the classical and jazz recordings, or perhaps the heyday of audiophile 70s rock, it was annoying and sad.
> It's a surprisingly common error where someone picks up an old 35mm print and assumes it is somehow canonical
Same applies for people buying modern vinyl records believing them to be more authentic than a CD or (god-forbid) online streaming.
Everything comes from a digital master, and arguably the vinyl copy adds artefacts and colour to the sound that are not part of the original recording. Additionally, the vinyl is not catching more overtones because it's analogue; there is no true analogue path in modern music any more.
I don't know if this is still true, but I know that in the 2000s the vinyls usually were mastered better than the CDs. There even was a website comparing CD vs vinyl releases, where the person hosting it was lamenting this fact because objectively CDs have a much higher dynamic range than vinyls, although I can't find it now. CDs were a victim of the loudness war[0].
Allegedly, for a lot of music that is old enough, the best version to get (if you have the kind of hi-fi system that can make use of it) is an early-80s CD release, because it sits in a sweet spot: it predates the loudness war, back when producers actually used the dynamic range of the CD.
[0] https://en.wikipedia.org/wiki/Loudness_war
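A very crude way to see this in your own library is to compare a track's peak level against its average (RMS) level: heavily brickwalled masters leave almost no headroom between the two, dynamic masters leave a lot. A sketch in Python/numpy, assuming the audio has already been decoded to a float sample array (decoding left out); crest factor is only a rough proxy, not a proper DR measurement:

    import numpy as np

    def crest_factor_db(samples: np.ndarray) -> float:
        """Peak-to-RMS ratio in dB: a rough proxy for how squashed a master is."""
        samples = samples.astype(np.float64)
        peak = np.max(np.abs(samples))
        rms = np.sqrt(np.mean(samples ** 2))
        return 20.0 * np.log10(peak / rms)

    # Synthetic comparison: a hard-limited signal has far less headroom than a dynamic one.
    t = np.linspace(0.0, 1.0, 44100, endpoint=False)
    dynamic = 0.3 * np.sin(2 * np.pi * 440 * t) * np.linspace(0.1, 1.0, t.size)
    brickwalled = np.clip(5.0 * np.sin(2 * np.pi * 440 * t), -1.0, 1.0)
    print(crest_factor_db(dynamic), crest_factor_db(brickwalled))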
The loudness wars were mostly an artifact of the 90s-2010s, because consumers were listening on horrible plasticky iPod earbuds or cheap Logitech speakers and the music had to sound good on those.
Once better monitors became more commonplace, mastering became dynamic again.
This is most clear with Metallica's Death Magnetic, which is a brickwalled monstrosity on the 2008 release but was fixed on the 2015 release[0]. And you can see this all over, where albums from the 90s had a 2000s "10-year anniversary" remaster that is heavily compressed, but then a 2010s or 2020s remaster that is dynamic again.
[0] Interestingly enough between those dates, fans extracted the non-brickwalled Guitar Hero tracks and mastered them as well as they could. Fun times :).
I dunno about authentic but for a while (as another commenter pointed out) they didn't have the loudness maxed out and / or had better dynamic range. That said, music quality aside, vinyls have IMO better collectability value than CDs. They feel less fragile, much more space for artwork and extras, etc.
I think the entire premise of the article should be challenged. Not only is 35mm not meant to be canonical, but the 35mm scans the author presented are not what we saw, at least for Aladdin.
I watched Aladdin more than any other movie as a child, and the Blu-ray screenshot is much more familiar to me than the 35mm scan. Aladdin always had the Velvia look.
> Early home releases were based on those 35 mm versions.
Here's the 35mm scan the author presents: https://www.youtube.com/watch?v=AuhNnovKXLA
Here's the VHS: https://www.youtube.com/watch?v=dpJB7YJEjD8
Famously, CRT TVs didn't show as much magenta, so in the 90s home VHS releases compensated by cranking up the magenta so that it would show correctly on the TVs of the time. It was a documented practice at the time.
So, yes the VHS is expected to have more magenta.
Anecdotally, I remember watching Aladdin at the movie theatre when it came out and later on TV multiple times and the VHS you saw doesn't correspond to my memories at all.
The author here is asserting that VHS releases were based on the 35mm scans, and that the oversaturation is a digital phenomenon. Clearly, that's not true.
I can't challenge the vividness of your memory. That's all in our heads. I remember it one way, and you remember it another.
For sure, the author simplified things for the article. Anyway, in the case of VHS, they were indeed based on the 35mm scan but then had additional magenta added (as well as pan and scan to change the aspect ratio).
The author is not wrong that oversaturation is a source-transfer phenomenon (the result will always be different unless special care is taken to compare with the source material).
On most TVs that magenta wouldn't have shown as much as the youtube video shows because TVs tended to have weaker magentas. Of course, it's not like TVs were that uniformly calibrated back then and there were variations between TVs. So depending on the TV you had, it might have ended up having too much magenta but that would have usually been with more expensive and more accurate TVs.
TLDR: Transfers are hard, any link in the chain can be not properly calibrated, historically some people in charge of transferring from one source to another compensated for perceived weak links in the chain.
The magenta thing is interesting. I learned something new. Reading the other comments, this seems to be as much a tale of color calibration as anything.
Regarding my memory, it becomes shakier the more I think about it. I do remember the purples but me having watched the cartoon could have affected that.
It sounds like in the case of Toy Story, the Pixar team were working toward a 35mm print as the final product, so that probably should be considered canonical: it's what the creative team set out to make.
I prefer the 35mm version, at least when viewing the scene with the soldiers.
This makes so much more sense now. After having kids I've been watching my fair share of Pixar, and I never remembered everything looking so flat and bland, but I would always chalk it up to my brain misremembering how it looked at the time. Good to know, I guess, that it wasn't just nostalgia, but sad that we continue to lose some of this history, and so soon.
It's kind of sad that what felt like a defining aesthetic at the time is now basically an accidental casualty of progress
Things like this are being preserved, you just have to sail the high seas.
Yeah I clicked this link going “oh god it’s because they printed to film, I bet, and man do I hope it looks worse so I don’t have to hunt down a bunch of giant 35mm scans of even more movies that can’t be seen properly any other way”
But no, of course it looks between slightly and way better in every case. Goddamnit. Pour one out for my overworked disk array.
And here I was thinking it was just my imagination that several of these look kinda shitty on Blu-ray and stream rips. Nope, they really are worse.
Piracy: saving our childhoods one frame at a time.
When it comes to Star Wars, people are literally spotting them in Photoshop frame by frame. :)
I'm not sure why you're getting downvoted. What you're hinting at is that a lot of original 35mms are now getting scanned and uploaded privately, especially where all the commercial releases on Blu-ray and streaming are based on modified versions of the original movies, or over-restored versions.
These can be especially hard to find as the files are typically enormous, with low compression to keep things like grain. I see them mostly traded on short-lived gdrives and Telegram.
> I see them mostly traded on short-lived gdrives and Telegram.
Someone tell this community to share over BT. Aint nobody got time to keep up with which platform/server everyone is on and which links are expired and yuck.
The main reason they are not shared as widely is that there's a bit of conflict within the community between those that really want to stay under the radar and not risk being targeted by copyright owners (and so try to keep things very much private between the donors who funded the 600-900 usd cost of the scans) and those who want to open up a bit more and so use telegram, reddit and upload to private trackers.
I would be surprised if they didn't end up on the prestigious private trackers
> with low compression to keep things like grain.
But you have algorithmic grain in modern codecs, so no need to waste so much space for noise?
This grain looks extremely fake.
Because one is genuine physics and the other is fake crap?
the calculations and the photons sent to your eyes are all genuine physics
One’s an accurate recording of how a real thing looked.
The other’s fake noise.
One’s a real photo from 1890. The other’s an old-timey instagram filter.
It makes sense that some folks might care about the difference. Like, I love my old family Polaroids. I would not want a scanned version of those to have the “noise” removed for compression’s sake. If that had been done, I’d have limited interest in adding fake noise back to them. By far my favorite version to have would be the originals, without the “noise” smoothed out at all.
Lots of folks have similar feelings about film. Faked grain isn’t what they’re after, at all. It’s practically unrelated to what they’re looking for.
> One’s an accurate recording of how a real thing looked.
> The other’s fake noise
But since there is no such thing as the real thing, it could just as well match one of the many real noise patterns in one of the many real things floating around, or a real thing at a different point in time with more/less degradation. And you wouldn't even know the difference, thus...
> It makes sense that some folks might care about the difference
Not really, it doesn't make sense to care about identical noise you can't tell apart. Of course, plenty of people care about all kinds of nonsense, so that won't stop those folks, but let's not pretend there is some 'real physics' involved.
But… of course there is? A record of a real thing is different from a statistical simulation of it.
I think you missed the "a" vs "the": you can encode different sources that would have different grains, or the same source would have different grain at different points in time.
But also a simulation called compression of a real thing is different from that real thing, so that purity test had already been failed
I just feed AI the IMDB summary and let it re-create the movie for me. Just as “pure” as high-bitrate h.265, after all.
You've chosen your argumentative perch very well, it's indeed right down there with the AI slop where you can't see any difference in reality
Well film grain doesn't matter because compression exists, apparently, and may as well be simulated because it's already failed the "purity test" and may as well be algo-noise. That holds for everything else! May as well maximize the compression and simulate all of it then.
[EDIT] My point is "film grain's not more-real than algo noise" is simply not true, at all. An attempt to represent something with fidelity is not the same thing as giving up and faking it entirely based on a guess with zero connection to the real thing—its being a representation and not the actual-real-thing doesn't render it equally as "impure" as a noise-adding filter.
You're still dancing there in the slop, hallucinating the arguments thinking it's a pretty dress!
It may as well be simulated because you won't see the difference! So now you've imagined some purity test which was never true, so you have nothing and start hallucinating some hyperbolic AI thing
> But also a simulation called compression of a real thing is different from that real thing, so that purity test had already been failed
Quoted: the introduction of “purity test” to the conversation, from not one of my posts.
You can’t trust corporations to respect or protect art. You can’t even buy or screen the original theatrical release of Star Wars. The only option is as you say. There are many more examples of the owners of IP altering it in subsequent editions/iterations. This still seems so insane to me that it’s not even for sale anywhere…
I don't understand why you're getting downvoted. So many beautiful things have been lost to perpetual IP, e.g. old games that could be easily ported by volunteers given source code, which can never be monetised again.
Sometimes people create things that surpass them, and I think it is totally fair for them to belong to humanity after the people that created them generated enough money for their efforts.
> You can’t even buy or screen the original theatrical release of Star Wars
You can, actually: the 2006 Limited Edition DVD is a double-disc version, one disc being the original version.
However, they are not DVD quality because they were transferred from LaserDisc and not the original film stock.
Even those aren’t accurate to the 1977 film.
To pick an arguably-minor but very easy to see point: the title’s different.
Would be annoying, but I suppose you could also recalibrate your display to turn down the greens?
VLC has a lot of image manipulation options.
What sort of terms might one search for?
"toy story film scan" on Kagi led me to a reddit page that may or may not contain links that might help you, but don't dawdle those links may not work forever.
Another one that's been hard to find is the 4k matrix original color grading release. Ping me if you have it! (Not the 1080p release)
You'd think there was a LUT you could apply to the digital copies during playback to make it look (more) like the original...
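That's essentially all a 3D LUT is: a lattice of measured input-to-output colours that the player interpolates per pixel at playback. Here's a bare-bones sketch of the lookup itself, using nearest-neighbour sampling rather than the trilinear interpolation a real player would use; the LUT below is only an identity placeholder, since no official "digital master back to 35mm print" LUT exists for these films.

    import numpy as np

    def apply_3d_lut(frame_rgb: np.ndarray, lut: np.ndarray) -> np.ndarray:
        """frame_rgb: (H, W, 3) uint8. lut: (N, N, N, 3) floats in [0, 1], indexed [r, g, b]."""
        n = lut.shape[0]
        # Map 0..255 to lattice indices (nearest neighbour for brevity).
        idx = np.round(frame_rgb.astype(np.float32) / 255.0 * (n - 1)).astype(np.int32)
        out = lut[idx[..., 0], idx[..., 1], idx[..., 2]]
        return (np.clip(out, 0.0, 1.0) * 255.0 + 0.5).astype(np.uint8)

    # Identity LUT standing in for a real film-emulation LUT loaded from a .cube file.
    n = 17
    r, g, b = np.meshgrid(*[np.linspace(0.0, 1.0, n)] * 3, indexing="ij")
    identity_lut = np.stack([r, g, b], axis=-1)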
I'm surprised they can't just put a filter on the digital versions to achieve a similar look and feel to the 35mm version.
It is clear that the animators factored in the colour changes from the original media to 35mm, so it seems a disservice to them to re-release their works without honouring how they intended the films to be seen.
They could, but it would require some work to get it right. This is very similar to conversations that happen regularly in the retro game scene regarding CRT monitors vs modern monitors for games of a certain era. The analog process was absolutely factored in when the art was being made, so if you want a similar visuals on a modern screen you will need some level of thoughtful post processing.
Disney 100% has access to colorists and best in class colour grading software. It must have been a business (cost cutting) decision?
They could reduce the saturation with 1 mouse click if they wanted, but they didn't. They must have intentionally decided that high saturation is desirable.
I’m reminded of the beginning of the movie Elf, where the book publisher is informed that a printing error means their latest book is missing the final two pages. Should they pulp and reprint? He says,
> You think a kid is going to notice two pages? All they do is look at the pictures.
I’m quite sure bean counters look at Disney kids movies the exact same way, despite them being Disney’s bread and butter.
With Star Wars you have a dedicated adult fan base that’ll buy up remasters and reworkings. Aladdin? Not so much. Especially in the streaming era, no one is even buying any individual movie any more.
I'm a 39 year old man who ground his VHS of Aladdin to dust in the 90s, and bought the Blu Ray because I can't say I can rely on streaming to always exist.
> With Star Wars you have a dedicated adult fan base that’ll buy up remasters and reworkings. Aladdin? Not so much. Especially in the streaming era, no one is even buying any individual movie any more.
I agree it was likely Disney being cheap, but there are tons of people who'll buy up Disney movies on physical media in the age of streaming. Not only are there Disney fans who'd rival the obsessiveness of Star Wars fans, but, like Lucas, Disney just can't leave shit alone. They go back and censor stuff all the time and you can't get the uncensored versions on their streaming platform. Aladdin is even an example where they've made changes. It's not even a new thing for Disney. The lyrics to one of the songs in Aladdin were changed long before Disney+ existed.
Steve Jobs' type attitude vs Bill Gates type attitude (in the 90s). Or, Apple vs Microsoft.
The Disney of yesterday might have been a bit more Jobs than Gates, compared to the Disney of today.
The vast majority of people will not care nor even notice. Some people will notice and say, hey, why is it "blurry." So do you spend a good chunk of time and money to make it look accurate or do you just dump the file onto the server and call it a day?
To say nothing of the global audience for these films. I'm guessing most people's first experience seeing these movies was off a VHS or DVD, so the nostalgia factor is only relevant to a small percentage of viewers, and only a small percentage of that percentage notices.
VHS resolution is total crap... yet: it's not uncommon for the colors and contrast on VHS (and some early DVD) to be much better than what is available for streaming today.
This is totally bonkers, because the VHS format is crippled, also color wise. Many modern transfers are just crap.
It’s really amazing how some Blu-rays do in fact manage to be net-worse than early DVD or even VHS, but it’s true.
An infamous case is the Buffy the Vampire Slayer tv show. The Blu-ray (edit: and streaming copies) went back to the film source, which is good, but… that meant losing the color grading and digital effects, because the final show wasn’t printed to film. Not only did they get lazy recreating the effects, they don’t seem to have done scene-by-scene color grading at all. This radically alters the color-mood of many scenes, but worse, it harms the legibility of the show, because lots of scenes were shot day-for-night and fixed in post, but now those just look like they’re daytime, so it’s often hard to tell when a scene is supposed to be taking place, which matters a lot in any show or film but kinda extra-matters in one with fucking vampires.
The result is that even a recorded-from-broadcast VHS is arguably far superior to the blu ray for its colors, which is an astounding level of failure.
(There are other problems with things like some kind of ill-advised auto-cropping seeming to have been applied and turning some wide shots into close-ups, removing context the viewer is intended to have and making scenes confusing, but the colors alone are such a failure that a poor VHS broadcast recording is still arguably better just on those grounds)
How can we get away from this mindset as a society, where craft and art are sacrificed at the altar of "it's not monetarily worth it."
There's a fucking lot of things that are not worth it monetarily, but worth it for the sake of itself. Because it's a nice gesture. Or because it just makes people happy. Not to sound like some hippie idealist, but it's just so frustrating that everything has to be commoditized.
It’s really been the driving force of modern life for centuries at this point.
Centuries is stretching it. It’s central to industrialisation, Taylor, Ford, etc. The relentless pursuit of efficiency and technique. Its antithesis is art for art’s sake.
In modern tech circles, the utilitarian mindset is going strong, now that the hacker ethos is dead and it’s all about being corporate friendly and hireable.
Yeah the industrialised world wasn't maligned by Blake as 'dark Satanic mills' or as Mordor by Tolkien because they found it an artistically fulfilling place.
> How can we get away from this mindset as a society, where craft and art are sacrificed at the altar of "it's not monetarily worth it."
Honestly, by weakening copyright protections. People who love the works will do the work to protect them when they don't have to fear being sued into bankruptcy for trying to preserve their own culture.
You can sit down and recolor the movie frame by frame and release it on torrent yourself, it'll make many people happy. It won't be worth it monetarily but since you're annoyed it doesn't exist and money isn't a factor...
It's always easy to complain about others not being generous enough with their time, but we always have an excuse for why we won't do it ourselves.
You can't do that, since besides time you also need knowledge/skill. So the final difference could be between "an extra 1% of the budget" at a corporate level vs "an extra 10% of your life to become a professional and fix a video, and also break the law in the process". Pretty easy to see how it's not just "an excuse", but a bit more fundamental an issue.
In this particular instance though, it's not really about time; it's studios not wanting to pay what I imagine would be a relatively small amount to do the conversion. It's not going to be a frame-by-frame laborious process.
> You can sit down and recolor the movie frame by frame and release it on torrent yourself, it'll make many people happy.
You can't, at least not if you want an acceptable result.
In photography, if you have a JPEG photo only, you can't do post-facto adjustments of the white balance, for that you need RAW - too much information has been lost during compression.
For movies it's just the same. To achieve something that actually looks good with a LUT (that's the fancy way for re-coloring, aka color grading), you need access to the uncompressed scans, as early in the processing pipeline as you can get (i.e. before any kind of filter is applied).
Just dialing down the red and blue channels a bit makes it much closer for several of the early '90s releases (look at that Aladdin example from TFA)
Disney do pay for industry-leading colorists. They chose to favour a more saturated look for Aladdin et al. It is reasonable to prefer either. I can't imagine what happened to the greens in the Toy Story examples if they are accurate.
And ultimately, what you need to achieve acceptable CRT effects is resolution. Only now, with 4K and above, can we start to portray the complex interactions between the electron beam and the image produced by your console. But the colour banding that caused the hearts in The Legend of Zelda to show a golden sheen is still unreachable.
Reminded me of this article about some retro games on crt vs lcd-
https://wackoid.com/game/10-pictures-that-show-why-crt-tvs-a...
It's not just about slapping on some grain and calling it a day; it's about honoring a whole set of artistic decisions that were made with that specific medium in mind
You can, that's what Pixar did while creating the film. From the article:
> During production, we’re working mostly from computer monitors. We’re rarely seeing the images on film. So, we have five or six extremely high-resolution monitors that have better color and picture quality. We put those in general work areas, so people can go and see how their work looks. Then, when we record, we try to calibrate to the film stock, so the image we have on the monitor looks the same as what we’ll get on film.
But they didn't do a perfect job (the behavior of film is extremely complex), so there's a question- should the digital release reflect their intention as they were targeting these calibrated monitors or should it reflect what was actually released? Also, this wouldn't include other artifacts like film grain.
> Then, when we record, we try to calibrate to the film stock, so the image we have on the monitor
Except, as they say, the high grade monitors were calibrated to emulate the characteristics of film.
If we can show that D+ doesn't look like the film, then we can point out that it probably doesn't look like the calibrated monitors either. Those army men are not that shade of slime green in real life, and you'll have a hard time convincing me that after all the thought and effort went in to the animation they allowed that putrid pea shade to go through.
The best option for me would be to release it in whatever format preserves the most of the original colour data without modification, then let the viewer application apply colour grading. Give me the raw renders in a linear 16bpc colour space with no changes. Sadly, I don't think we have digital movie formats that can handle that.
It is doable and you can get presets designed to mimic the look of legendary photography film stock like Velvia. But what they did back then was very much an analog process and thus also inherently unstable. Small details start to matter in terms of exposure times, projectors used etc. There’s so many frames and it took so much time, that it’s almost guaranteed there’d be noticeable differences due to process fluctuations.
It's even worse with The Matrix where nobody is even sure any more how it was supposed to look, except definitely not as green as the DVD release.
Noodle made a charming video about going mad researching this: https://www.youtube.com/watch?v=lPU-kXEhSgk
I've grown very fond of having shaders available for my retro games.
I suspect having shader plugins for TV and movie watching will become a thing.
"The input is supposed to be 24 FPS, so please find those frames from the input signal. Use AI to try to remove compression artifacts. Regrade digital for Kodak 35mm film. Then, flash each frame twice, with blackness in-between to emulate how movie theaters would project each frame twice. Moderate denoise. Add film grain."
I don't actually know what kind of filters I'd want, but I expect some people will have very strong opinions about the best way to watch given movies. I imagine browsing settings, like browsing user-contributed Controller settings on Steam Deck...
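As a sketch of what such a "plugin chain" could look like under the hood, it can literally just be a list of frame transforms applied in order. The steps below are deliberately crude stand-ins (a tiny green pull-down and plain Gaussian noise), not real regrading or grain models:

    import numpy as np

    def regrade(frame: np.ndarray) -> np.ndarray:
        # Stand-in for a film-emulation grade: pull the greens down slightly.
        frame = frame.copy()
        frame[..., 1] *= 0.92
        return frame

    def add_grain(frame: np.ndarray, strength: float = 6.0) -> np.ndarray:
        # Stand-in for synthetic grain: plain Gaussian noise, not a real grain model.
        rng = np.random.default_rng(0)
        return frame + rng.normal(0.0, strength, frame.shape)

    PIPELINE = [regrade, add_grain]

    def process(frame_rgb: np.ndarray) -> np.ndarray:
        out = frame_rgb.astype(np.float32)
        for step in PIPELINE:
            out = step(out)
        return np.clip(out, 0.0, 255.0).astype(np.uint8)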
Neat! The Youtube channel Noodle recently did a related deep dive into the differences in the releases of The Matrix [0]. The back half of the video also touches on the art of transferring from film/video to digital.
[0]: https://www.youtube.com/watch?v=lPU-kXEhSgk
I always felt the old Matrix had a colder blue, and it changed drastically when the second and third hit cinemas. At least that was my memory: I watched a double feature when the second one hit theatres and complained then that The Matrix somehow looked weird. But it could also be my memory, since I also own the Blu-ray release.
Another movie with the same / similar problem is the DVD release of the Lord of the Rings Extended Editions, and both the Blu-ray and 4K versions. As far as I remember, they fixed it for the theatrical version in 4K but not the extended.
At least they messed around with the green.
https://www.youtube.com/watch?v=XR0YBqhMtcg
It's wild to realize that the version of the movie most of us remember emotionally is not the one that's currently streaming. There's something bittersweet about that... like your memory has a certain warmth and texture that modern restorations just can't quite recreate.
Film weave could also be worth mentioning.
Movies projected on film look different not only because of the color and texture, but also a constant spatial jitter over time. When the film moves through the projector, each frame locks into a slightly different position vertically. That creates a wobble that's called "film weave."
(If you want to create truly authentic-looking titles for a 1980s B-grade sci-fi movie, don't forget to add that vertical wobble to your Eurostile Extended Bold layout that reads: "THE YEAR IS 2025...")
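And if you did want to fake that weave, a slowly wandering vertical offset per frame gets you most of the way. A toy sketch; the magnitudes are guesses, and real weave is subtler and not purely vertical:

    import numpy as np

    def weave_offsets(num_frames: int, max_pixels: float = 2.0, seed: int = 0) -> np.ndarray:
        """Smoothed random walk of per-frame vertical offsets, in pixels."""
        rng = np.random.default_rng(seed)
        walk = np.cumsum(rng.normal(0.0, 0.4, num_frames))
        walk -= walk.mean()
        return np.clip(walk, -max_pixels, max_pixels)

    def apply_weave(frame: np.ndarray, offset_px: float) -> np.ndarray:
        # Whole-pixel roll for simplicity; a real implementation would resample sub-pixel.
        return np.roll(frame, int(round(offset_px)), axis=0)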
Film weave is such an underrated part of that analog feel
The texture of the film grain makes Mulan and Aladdin really look better. The large simple filled sections look like they have so much more to them.
The one frame they showed from the Lion King really stood out. The difference in how the background animals were washed out by the sunlight makes the film version look significantly better.
I'm not sure if I'm just young enough to be on the other side of this despite seeing all three of those Disney movies as a millennial kid (Lion King and Aladdin were VHS mainstays in my house, and I remember seeing Mulan in theaters), but I honestly don't find the film grain to look better at all and think all three of those bottom images are much more appealing. For the Toy Story ones, I think I'm mostly indifferent; I can see why some people might prefer the upper film images but don't really think I'd notice which one I was watching. I'd definitely think I'd notice the difference in the 2D animation though and would find the film grain extremely distracting.
To me it's much worse. You can't see all of the detail the artists drew, and there is noise everywhere, even specks of dust and scratches. Whenever I watch a film-based movie my immersion always gets broken by all the little specks that show up. Digital is a much more immersive experience for me.
> To me it's much worse. You can't see all of the detail the artists drew, and there is noise everywhere, even specks of dust and scratches.
In the lion king example you weren't meant to see all of the detail the artists drew. In the Army men example the color on the digital version is nothing like the color of the actual toys.
They originally made those movies the way they did intentionally because what they wanted wasn't crystal clear images with unrealistic colors, they wanted atmosphere and for things to look realistic.
Film grain and dust can be excessive and distracting. It's a good thing when artifacts added by dirt/age get cleaned up for transfers so we can have clear images, but the result of that clean-up should still show what the artists originally intended, and that's where Disney's digital versions really miss the mark.
This is an interesting take when you look at the gas station Toy Story example and consider the night sky. In the digital version the stars are very washed out but in the film version the sky is dark and it's easy to appreciate the stars. Perhaps it's unrealistic when you realize the setting is beneath a gas station canopy with fluorescent lights, but that detail, along with some of the very distinct coloring, stuck out to me.
Which is of course highly subjective; you could argue that film grain is an unwanted but unavoidable side-effect from the medium used, just like other artifacts from film - vertical alignment issues, colour shifting from "film breath", 24 frames per second, or the weird sped-up look from really old films.
I don't believe these were part of the filmmaker's vision at the time, but unavoidable. Nowadays they are added again to films (and video games) on purpose to create a certain (nostalgic) effect.
It does, but much more important to me is the color grading. The white point in the film versions is infinitely better.
Same is true of home video hardware:
If you plug a Nintendo system's RCA cables into a modern TV, it will look like garbage. Emulated games on LCDs look pixelated.
Those games were designed for a CRT's pixel grid. They don't look right on LCDs, and the upscalers in home theater equipment don't respect that. There are hardware upscalers and software shaders that are specifically designed to replicate a CRT's quirks, to let you better approximate how those games were designed to be played.
Related - someone recently built a CRT dock for his Switch, so he could play Nintendo Switch Online's emulated games as originally intended:
https://www.youtube.com/watch?v=wcym2tHiWT4
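For a flavour of what those shaders do, the simplest possible CRT-ish effect is an integer upscale followed by darkening alternate "scanlines". Real shaders go much further (phosphor masks, bloom, curvature, composite artifacts), so treat this purely as a toy illustration:

    import numpy as np

    def scanline_upscale(frame: np.ndarray, scale: int = 4, darkness: float = 0.35) -> np.ndarray:
        """Nearest-neighbour upscale of an (H, W, 3) uint8 frame, then dim one row per block."""
        up = np.repeat(np.repeat(frame, scale, axis=0), scale, axis=1).astype(np.float32)
        up[::scale] *= (1.0 - darkness)   # darken the first row of each scaled block
        return np.clip(up, 0.0, 255.0).astype(np.uint8)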
I'm stunned so many people here can remember details as fine as the colour grading of a film. I couldn't remember specifics like that from 6 months ago, let alone 30 years ago when I was a child and wouldn't have had the thought to watch for cinematographic touches.
Side note - I wonder if it's a millennial thing that our memories are worse due to modern technology, or perhaps we are more aware of false memories due to the sheer availability of information like this blog post.
At least for me it's not so much details like color grading over the entire film, it's more like a specific scene got burned into memory. Movie looks pretty much fine until reaching that scene and it's so different it's like something got shaken loose and you start to see the larger issues.
For an example people here might be more familiar with, it's like how you can't even see bad kerning until you learn what it is, then start to notice it a lot more.
I am not a huge gamer - maybe a dozen hours a year. But I feel that, say, Mario responds differently to controls in an emulator than how I remember Mario responding on an NES with a CRT.
But I was never very good, and it has been decades, so I don't know how much of this is just poor memory - I actually don't think I'm good enough/play enough that the latency of modern input/displays makes a difference at my level.
I would love to try both side-by-side to see if I could pick out the difference in latency/responsiveness.
I doubt many people "remember" this to any significant extent. But there are probably many cases of media giving off the "wrong" vibe in a new release, and you just assume it's because you've gotten older - until you get access to the original you experienced, the "good" vibe comes back, and you can easily compare the two.
Some people do in fact remember the differences, but I'd guess a lot of those cases come from experiencing both versions in fairly quick succession. It's one thing to compare a DVD you watched 20 years ago with a Blu-ray you only watched today, and another to have watched the DVD 15 years ago and the Blu-ray 14 years ago.
They probably can't.
Different people just remember different things. I bet most people don't remember either version and only go "ah yes, of course!" after reading this blog post (which means they didn't remember at all).
Anecdata here, but I played Zelda Ocarina of Time on CRT when I was a child, and have since replayed it many times via emulator. The game never looked quite as good as I remembered it, but of course I chalked it up to the fact that graphics have improved by several orders of magnitude since then.
Then a few years ago I was throwing out my parents' old CRT and decided to plug in the N64 one last time. Holy crap, it was night and day. It looked exactly as I remembered it, so much more mysterious and properly blended than it does on an LCD screen.
I don't see why the same wouldn't apply to films, sometimes our memories aren't false.
Film is magical. We should preserve and incentivise it.
Personally, I prefer film versions in every example listed.
> He [David DiFrancesco] broke ground in film printing — specifically, in putting digital images on analog film.
> Their system was fairly straightforward. Every frame of Toy Story’s negative was exposed, three times, in front of a CRT screen that displayed the movie.
While I have no doubt this hadn't been done at that scale and resolution before, it struck me that I'd heard about the concept in a podcast episode [1] discussing very early (1964) computer animation alongside the SC4020 microfilm printer, which used a Charactron CRT to display text or plot lines for exposure onto film.
[1] https://adventofcomputing.libsyn.com/episode-88-beflix-early...
Is it possible to replicate the digital->film transition with tone mapping? (I assume the answer is yes, but what is the actual mapping?)
Generally yes, but we're still working on it all these years later! This article by Chris Brejon offers a very in-depth look into the differences brought about by different display transforms: https://chrisbrejon.com/articles/ocio-display-transforms-and...
The "best" right now, in my opinion, is AgX, which at this point has various "flavours" that operate slightly differently. You can find a nice comparison of OCIO configs here: https://liamcollod.xyz/picture-lab-lxm/CAlc-D8T-dragon
How well does 35mm hold up over time? Could these movies be said to “no longer exist” in some sense, if the scans have decayed noticeably?
Playing them, handling them, and poor storage all degrade them. Most original prints will have been played many times, and often haven’t been consistently stored well.
The 4K77 etc. fan scans of the original Star Wars trilogy, which aimed to get as close as possible to what one would have seen in a theater the year of release, used multiple prints to fill in bad frames, and used references like (I think) magazine prints of stills and well-preserved fragments or individual frames to fix the color grading and contrast (always faded, sometimes badly). They also had to extensively hand-correct things like scratches, with some reels or parts of reels requiring far more of that kind of work than others. Even Jedi required a lot of it, and those reels would have been only something like 30-35 years old when work started on them.
Hollywood does store original prints in underground salt mines (at least I am aware of a place in Kansas where they do this). Of course, who knows where the frames we are being shown from the 35mm film version are coming from - likely not these copies, which are probably still in halite storage.
So it's fascinating reading this and looking at the screengrabs of the "original" versions... not so much because they are "how I remember them" but because they have a certain nostalgic quality I can't quite name - they "look old". Presumably this is because, back in the day, when I was watching these films on VHS tapes, they had come to tape from 35mm film. I fear I will never again be able to look at "old looking" footage with the same nostalgia, now that I understand why it looks that way - and, indeed, that it isn't supposed to look that way!
Beauty and the Beast on Bluray looks completely different from what I remember; I had assumed that they had just regraded it, but given that it was developed with CAPS, maybe this is part of the effect?
Man, this makes me want to watch original 35mm releases of all these films. It is unfortunate that they are so hard to get your hands on these days.
This thread has a download link for a Toy Story 35mm scan. Not sure if it works, but it might be worth trying.
https://www.reddit.com/r/toystory/comments/1hhfuiq/does_anyo...
I call it the Newgrounds animation effect. Digital-to-digital always looked a bit unserious to me.
I’m sure many young people feel the exact opposite.
> "Even so, it’s a little disquieting to think that Toy Story, the film that built our current world, is barely available in the form that wowed audiences of the ‘90s."
Load it up in DaVinci Resolve, knock the saturation and green curve down a bit, and boom, it looks like the film print.
Or you could slap a film-look LUT on, but you don't need to go that far.
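If you want to approximate that outside Resolve, here is roughly what "saturation down, green down" looks like as a numpy sketch on a decoded 8-bit frame. The gain and gamma numbers are guesses, not matched to the film print; a real grade would be judged by eye on a calibrated display.

    # Rough numpy stand-in for "saturation down, green curve down" on an 8-bit
    # RGB frame. The numbers are guesses, not a grade matched to the film print.
    import numpy as np

    def quick_degreen(frame, saturation=0.85, green_gain=0.92, green_gamma=1.05):
        rgb = frame.astype(np.float32) / 255.0
        # Desaturate by blending each pixel toward its luma.
        luma = rgb @ np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)
        rgb = luma[..., None] + saturation * (rgb - luma[..., None])
        # Pull greens down with a gain plus a slight gamma bend.
        rgb[..., 1] = green_gain * np.clip(rgb[..., 1], 0.0, 1.0) ** green_gamma
        return (np.clip(rgb, 0.0, 1.0) * 255.0).astype(np.uint8)

    frame = np.zeros((1080, 1920, 3), dtype=np.uint8)   # stand-in for a decoded frame
    graded = quick_degreen(frame)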
Excellent article, really enjoyed it.
Yeah, this is the kind of thing that makes me really enjoy the internet.
What an excellent piece! I thoroughly enjoyed reading it, brought my childhood memories flooding back. I have so many fond recollections of that 90s era, including "A Bug's Life." I remember gathering with my cousins at my grandmother's house to watch these films on VHS. Time flies.
It also reminds me of the 24 FPS discussion. As far as I know 24 FPS is still the standard for cinema, even though 48 or 60 FPS are pretty standard for series; the 24 FPS gives it a more cinematic feeling.
https://www.vulture.com/2019/07/motion-smoothing-is-ruining-... https://www.filmindependent.org/blog/hacking-film-24-frames-...
To add, when it comes to video games people sometimes go "baww, but 24 fps is enough for film". However, pause a film and you'll see a lot of smearing, not unlike cartoon animation frames I suppose; in a video game every frame is a discrete, sharp image, so a low framerate is visually much more apparent.
I think it was The Hobbit that had a 48 fps version, and people just... weren't having it. It's technologically superior I'm sure (as higher frame rates would be), but it just becomes too "real" then. IIRC they also had to really step up their make-up game, because at higher frame rates and/or resolutions people can see everything.
Mind you, watching older TV shows nowadays is interesting; I think they were able to scan the original film for e.g. The X-Files and make an HD or 4K version of it, and unlike back in the day, you can now make out all the fine details of the actors' skin and the like. Part high definition, part watching it on a 4K screen instead of a CRT TV.
It’s fascinating to me how many of these discussions boil down to dialing in dynamic range for the medium in question.
As the Aladdin still shows with its wildly altered colors clearly other aspects matter/are at play. But the analog/digital discussions always seem, at least to me, to hinge heavily on DR. It’s just so interesting to me.
Many of us remember the leap from SD->HD. Many of us also can point out how 4K is nice and even noticeably better than FHD, but man…getting a 4K OLED TV with (and this is the important part) nice DR was borderline another SD->HD jump to me. Especially with video games and older films shot and displayed on film stock from start to finish. The difference is incredibly striking.
Don't tell me what I remember.
I remember it grainier and with occasional 'VHS zebra stripes'(?) too, though.
If you're interested in these 35mm film scans, I recommend watching this excellent YouTube video "SE7EN & How 35mm Scans Lie to You" https://www.youtube.com/watch?v=uQwQRFLFDd8 for some more background on how this works, and especially how these comparisons can sometimes be misleading and prey on your nostalgia a bit.
If you're interested in making digital footage look exactly like film in every possible way, I'll shill our product Filmbox: https://videovillage.com/filmbox/
Now there is the problem where many of my friends will take one look at a movie I've put on the TV and say "ew, I don't like this movie, it's old." They don't realize that the reason they feel that way, viscerally, is that it's shot on film. How do I get people to watch film movies with me? On average they're far better than many modern movies anyway (from a storytelling and moviemaking point of view, to say nothing of the picture quality).
Make them into a drinking game. We watched The Princess Bride the other day (hadn't watched it before); I think it's aged pretty well, but then I'm old. If they get bored, make it a game: have a drink or score a point or whatever for every sexual innuendo, lol.
Some films didn't age well though.
And finally, accept it and move on, ultimately it's their loss.
Show them a Tarantino movie
Happy to have set my television to less than half saturation
>Computer chips were not fast enough, nor disks large enough, nor compression sophisticated enough to display even 30 minutes of standard-definition motion pictures.
This is not true at all. Being compatible with outdated, film-based projectors was much more important for being able to show it in as many theaters as possible. If they had wanted to do a digital screening, it would have been technologically possible.
I bumped on this too, since 1994-1995 was about the time when multi-gigabyte hard drives were readily available and multiple full-motion video codecs were being used in games, albeit for cutscenes. Theater projector compatibility makes complete sense.
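Back-of-envelope, with my own assumed bitrates rather than anything from the article: at a roughly DVD-like MPEG-2 rate, 30 minutes of SD video is only about a gigabyte, which fits comfortably on the multi-gigabyte drives of that era.

    # Rough numbers: ~4 Mbit/s MPEG-2 video plus compressed audio (assumed, not sourced).
    minutes = 30
    video_kbps, audio_kbps = 4000, 384
    total_bytes = (video_kbps + audio_kbps) * 1000 / 8 * minutes * 60
    print(f"{total_bytes / 1e9:.2f} GB")   # ~0.99 GB for 30 minutes of SD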
In 1994-1995, all the pieces for digital cinema were there, but they weren't integrated, and there were no installed projectors. The Phantom Menace was shown digitally... on two screens. By the end of 2000, there were 31 digital cinema screens in theaters.
Digital cinema eventually went with Motion JPEG 2000 at high quality settings, which leads to very large files but also much better fidelity than a video codec of that era would likely have managed.
https://en.wikipedia.org/wiki/Digital_cinema
> In 1994-1995, all the pieces for digital cinema were there, but they weren't integrated, and there were no installed projectors.
I agree with that. The article's quote from Pixar's "Making The Cut at Pixar" book was that the technology wasn't there (computer chips fast enough, storage media large enough, compression sophisticated enough) and I--along with the comment I replied to--disagree with that conclusion.
Back in that period I was somewhat in charge of the render queue at a small animation company. I had to get rendered images onto tape, as in Sony Digibeta or better. Before that I had to use film.
We had an incredible amount of fancy toys with no expense spared, including those SGI Onyx InfiniteReality boxes with the specialist video breakout boards that did digital video, or analogue with genlock. Disks were 2 GB SCSI and you needed a stack of them in RAID formations to play video. This wasn't even HD, it was 720 x 576 interlaced PAL.
We also had to work within a larger post production process, which was aggressively analogue at the time with engineers and others allergic to digital. This meant tapes.
Note that a lot of this was bad for tape machines. These cost £40k upwards, and advancing the tape by one frame to record it, then winding back to reposition for the next frame, for hours on end, was a sure way to wreck a tape machine, so we just hired them.
Regarding 35mm film, I also babysat the telecine machines where the film bounces up and down on the sprockets, so the picture is never entirely stable. These practical realities of film just had to be worked with.
The other fun aspect was moving the product around. This meant hopping on a train, plane or bicycle to get tapes to where they needed to be. There was none of this uploading malarkey although you could book satellite time and beam your video across continents that way, which happened.
Elsewhere in broadcasting, there was some progress with glorified digital video recorders. These were used in the gallery and contained the programming that was coming up soon. These things had quite a lot of compression and their own babysitting demands. Windows NT was typically part of the problem.
It was an extremely exciting time to be working in tech but we were a long way off being able to stream anything like cinema resolution at the time, even with the most expensive tech of the era.
Pixar and a few other studios had money and bodies to throw at problems, however, there were definitely constraints at the time. The technical constraints are easy to understand but the cultural constraints, such as engineers allergic to anything digital, are hard to imagine today.
Those comparisons were strangely jarring. It's odd to see, on an internet awash with "Mandela Effect" joke conspiracies, direct photo/video evidence that things we remember from our childhood have indeed been changed - sometimes for the worse!
I just showed Toy Story to my kids. It looked really bad. Mostly textures and lighting.
I wonder if artificial grain would actually make it look better.
Like when the game Splinter Cell was released, there were two additional "views" simulating infrared and thermal cameras. Those had heavy noise added to them and felt so much more real compared to the main view.
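If anyone wants to try the artificial-grain idea, here is a quick-and-dirty numpy sketch (the frame, strength and seed are just placeholders). Real film grain varies with exposure and differs per colour layer, so uniform gaussian noise is only a toy to see whether a bit of texture helps.

    # Quick-and-dirty grain: one gaussian noise layer added equally to R, G, B.
    # Real film grain varies with density and per colour layer; this is a toy.
    import numpy as np

    def add_fake_grain(frame, strength=6.0, seed=None):
        rng = np.random.default_rng(seed)
        noise = rng.normal(0.0, strength, size=frame.shape[:2])   # luma-ish noise layer
        grained = frame.astype(np.float32) + noise[..., None]     # same noise on all channels
        return np.clip(grained, 0, 255).astype(np.uint8)

    frame = np.full((1080, 1920, 3), 128, dtype=np.uint8)   # stand-in frame
    out = add_fake_grain(frame, seed=42)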
Pixar did re-renders of Toy Story.
https://www.youtube.com/watch?v=6w4bzm6ewRQ
Interesting - I think the film versions feel like they have more gravitas, especially The Lion King and Mulan scenes.
I find a lot of the stuff I remember from decades ago looks worse now. Toy Story in particular: I watched it on a projector after I'd seen Toy Story 4, and it looked bad, almost to the point that I wish I hadn't tarnished my memory of it. Similar things have happened with N64 games that I cherished when I was little.
I don't buy that it's a real degradation due to different presentation methods. I'm sorry, but no matter what film stock you lovingly transfer Toy Story to, it's never going to look like it does in your memory. Same with CRTs. Sure, it's a different look, but my memory still looks better.
It's like our memories get automatically upgraded when we see newer stuff. It's jarring to go back and realise it didn't actually look like that in the 90s. I think this is just the unfortunate truth of CGI. So far it hasn't reached the point of producing something timeless. I can watch a real film from the 80s and it will look just as "good" as one from today. Of course the colours will be different depending on the transfer, but what are we hoping for? To get the exact colours the director saw in his mind's eye? That kind of thing has never really interested me.
> Same with CRTs. Sure, it's a different look, but my memory still looks better.
I don’t have this issue and never have. For whatever reason I’ve never “upgraded” them in my mind, and they look today exactly as I remember them when played on period hardware.
The changes in the Aladdin and Lion King stills surely can’t be accidental side effects? The Aladdin shot especially looks like they deliberately shifted it to a different time of day. Could there have been a continuity reason?
WTF happened to The Simpsons on Disney+? It looks like it's zoomed in.
There's an option to switch back to the original 4:3 ratio.
The Simpsons was originally made in 4:3. Many people don't like watching with large black bars on the left and right, so they show a cropped 16:9 version. People complained because the crop is occasionally a problem and ruins a joke, so I believe you can now opt into either.
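For scale: cropping a 4:3 frame to fill 16:9 (keeping the full width) throws away a quarter of the picture, which is exactly where framing gags tend to live. A two-line sanity check, assuming a plain centre crop:

    # Fraction of a 4:3 frame lost when centre-cropped to fill 16:9 (full width kept).
    source_ar, target_ar = 4 / 3, 16 / 9
    kept_height = source_ar / target_ar        # 0.75 of the original height survives
    print(f"{1 - kept_height:.0%} of the picture is cropped away")   # -> 25%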
A similar crime against taste as the pan-and-scan "fullscreen" DVDs of the early 2000s. If I want to pay to watch something, don't crop out a chunk of what the cinematographer wanted me to see...
David Simon talked about this for the HD release of The Wire:
https://davidsimon.com/the-wire-hd-with-videos/
It seems like the video examples are unfortunately now unavailable, but the discussion is still interesting and it's neat to see the creative trade-offs and constraints in the process. I think those nuances help evoke generosity in how one approaches re-releases or other versions or cuts of a piece of media.
There's a (much less severe) instance of that peeve with computer video player apps that have slightly rounded corners on the windows.
Pan and scan wasn't a DVD innovation. Most VHS releases were pan and scan too; DVDs at least commonly had widescreen available (many early discs came with widescreen on one side and full screen on the other... good luck guessing whether "widescreen" printed on the hub means the side you're reading is widescreen, or that the other side is widescreen and the widescreen label should face up in your player).
I believe this is the best example of the problems that can be caused:
https://x.com/TristanACooper/status/1194298167824650240
Open both images and compare. The visual joke is completely ruined with the cropping.
Wow. Based on those comparisons they really do feel completely different. Really remarkable how such relatively simple changes in lighting and whatnot can drastically change the mood.
And here I was thinking of re-watching some old Disney/Pixar movies soon :(
> These companies, ultimately, decide how Toy Story looks today.
LOL, what? Anyone with a Blu-Ray rip file and FFmpeg can decide how it looks to them.
And how many people will have that? Eventually they'll just go "eat ze bug" and you'll have to eat whatever they give you.
TL;DR: Linking to YouTube trailer scans as comparisons for colour is misleading and not accurate.
---
> see the 35 mm trailer for reference
The article makes heavy use of scans of trailers to show what the colours, grain, sharpness, etc. looked like. This is quite problematic, because you are relying on a scan done by someone on the Internet to accurately depict what something looked like in a commercial cinema. Now, I am not a colour scientist (far from it!), but I am a motion picture film hobbyist and so can speak a bit about some of the potential issues.
When projected in a movie theatre, light is generated by a short-arc xenon lamp. This has a very particular output light spectrum, and the entire movie process is calibrated and designed to work with this. The reflectors (mirrors) in the lamphouse are tuned to it, the films are colour graded for it, and then the film recorders (cameras) are calibrated knowing that this will be how it is shown.
When a film is scanned, it is not lit by a xenon short-arc lamp; instead various other illumination methods are used depending on the scanner, with CRTs and LEDs being common. Commercial scanners are, on the whole, designed to scan negative film. That's where the money is, so they are set up to work with that, which is very different to positive movie release film stock. Scanners therefore have different profiles to try to capture the different film stocks, but in general, today's workflow involves scanning something in and then colour correcting post-scan, to meet an artist's expectations/desires.
Scanning and accurately capturing what is on a piece of film is something that is really quite challenging, and not something that any commercial scanner today does, or claims to do.
The YouTube channels referenced are FT Depot, and 35mm Movie Trailers Scans. FT Depot uses a Lasergraphics 6.5K HDR scanner, which is a quite high end one today. It does have profiles for individual film stocks, so you can set that and then get a good scan, but even the sales brochure of it says:
> Many common negative film types are carefully characterized at Lasergraphics to allow our scanning software to compensate for variation. The result is more accurate color reproduction and less time spent color grading.
Note that it says that less time is spent colour grading - it is still not expected that it will accurately capture exactly what was on the film. It also specifies negative, I don't know whether it has positive stock profiles as I am not lucky enough to have worked with one - for this, I will assume it does.
The "scanner" used by 35mm Movie Trailers Scans is a DIY, homemade film scanner that (I think, at least the last time I spoke to them) uses an IMX-183 sensor. They have both a colour sensor and a monochrome sensor, I am not sure what was used to capture the scans linked in the video. Regardless of what was used, in such a scanner that doesn't have the benefit of film stock profiles, etc. there is no way to create a scan that accurately captures what was on the film, without some serious calibration and processing which isn't being done here. At best, you can make a scan, and then manually adjust it by eye afterwards to what you think looks good, or what you think the film looks like, but without doing this on a colour calibrated display with the original projected side-by-side for reference, this is not going to be that close to what it actually looked like.
Now, I don't want to come off as bashing a DIY scanner - I have made one too, and they are great! I love seeing the scans from them, especially old adverts, logos, snipes, etc. that aren't available anywhere else. But, it is not controversial at all to say that this is not colour calibrated in any way, and in no way reflects what one actually saw in a cinema when that trailer was projected.
All this is to say that statements like the following in the article are pretty misleading - the differences may not be attributable to the direct-digital-release process at all, and could just be that a camera's white balance was set wrong, or that post-processing toward what "looked good" came out different from the original:
> At times, especially in the colors, they’re almost unrecognizable
> Compared to the theatrical release, the look had changed. It was sharp and grainless, and the colors were kind of different
I don't disagree with the premise of the article - recording an image to film and then scanning it back in for a release _will_ result in a different look from a direct-digital workflow. That's why major Hollywood films spend money shooting on and scanning film to get the "film look" (although that's another can of worms!). It's just not an accurate comparison to put two images side by side when one is a trailer scan of unknown accuracy.
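To make the white-balance point concrete: an uncalibrated scan is effectively a per-channel gain on top of whatever is on the print, so even a neutral grey picks up a cast. The gains below are invented for illustration, not measured from any scanner.

    # An uncalibrated scan acts like a per-channel gain; even neutral grey picks
    # up a cast. Gains are invented for illustration, not measured from a scanner.
    import numpy as np

    neutral_grey = np.array([0.5, 0.5, 0.5])        # what the print "contains"
    scanner_gains = np.array([1.10, 1.00, 0.85])    # hypothetical mis-set white balance
    scanned = np.clip(neutral_grey * scanner_gains, 0.0, 1.0)
    print(scanned)   # roughly [0.55 0.5 0.425] - a warm cast that was never on screen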
Damn. I wish we could get a release with the 35mm colors the way they look in these comparisons. The Aladdin one specifically looks so good! It makes me feel like we're missing out on so much from the era when these films were released.