It's had a huge revival over the past two years. I got into it about seven years ago when it was still niche, back when the attitude was "who shoots film anymore? Just get a Sony A7!". Now the photography community has had a big backlash against digital, especially in the hobby space.
Now it's very fashionable. Pretty ironic, since the biggest destination for it is just Instagram, where it ends up as a pixelated square, but alas. The plus side is that the revived interest has led to more companies and people making and selling cool film stuff: new film types, accessories, labs opening. The biggest downside is that prices go up to match demand. Good film cameras are no longer being produced, so there's a limited supply out there. And Kodak (the best color film producer; nothing beats Ektar 100) keeps raising prices, matched only by the price increases at your lab to develop and get prints.
Film photography is an amazing hobby. It's a really fun artistic outlet through which just about anybody can learn a lot about themselves. Shoot for a year or two and you'll discover there's a kind of photo you most like taking. Maybe black and white architecture. Maybe golden hour landscapes. Few other hobbies teach you about your own tastes like that.
I'm not very good. Maybe it's partly lack of experience. I think I mostly have some 400TX and Fomapan, and I think I have some Fuji 400 something for color. I have some dry developer too; I'll probably have to use it up fairly soon before it gets too old. I develop the negatives myself, then scan them with a photo scanner.
It doesn't hurt, actually. During major incidents in my non-remote roles, I was expected to be in the office and available for the duration of the active incident, even if I wasn't able to contribute actively (contrary to what folks may have seen on NCIS, having two people typing on the same keyboard is not actually helpful when fighting hackers :P ).
As a remote worker I can be at home and present with my family, with short breaks for actual activity and longer periods for active response. This is not speculation - I have been active on incident response within the last month while helping my kids with homework, side by side at my home office desk.
I would not trust a surgeon to operate on me unless they have over 65 years of experience. I want them to have operated on patients since 'Nam. You went to a state medical school? Pfft. Go kill some other patient, not me.
In all seriousness, your point raises the question of where the responsibility for this immensely difficult task (securing networks) falls. If we could spread the "required" 15 years of experience across all of the developers, would that have the same effect? Building software with security baked in would reduce the need for so much work after the fact.
General security awareness training in CS programs (not the 'don't get phished' type of security awareness) would certainly go a long way, in my opinion. Teaching security as a fundamental necessity of programming would, down the road, lessen the load everywhere else.
But there is also a fundamental disconnect between what schools are teaching and what industry is hiring for. The answer right now is "Go to school for cybersec, get your certs, then work for X years as a low-level help desk agent or call-center phone jockey".
Industry needs to tell educational institutions what candidates gain from being a password-resetter that isn't taught in school, and work with those institutions to get those skills into the curriculum.
I have a lot more to say on the topic of cybersecurity and hiring, but I'm getting into rant territory.
Edit to add: You mentioned 'spreading out the 15 years of required experience'. I firmly believe it takes nowhere near 15 years of experience to become competent at cybersec.
That safety net is part of education. Schools keep children off the streets for 7 hours a day and offer reduced-price lunches. Investing in a social safety net is code for investing in public education.