
I've worked with two types of older engineers:

1) Ones who keep up their skills

2) Ones who don't

The former are a treasure trove of knowledge and skills, and provide substantially more value than anyone junior ever could. Going through so many computing eras gives them a higher-level way of thinking about abstraction and a deeper understanding of computer architectures. They've hand-tweaked assembly, C, and Java, and when they're now doing JavaScript or Python, they understand all the layers of metal underneath. They've gone through flow charts, structured, functional, and object-oriented programming, and all the variants thereof. They've written high-speed algorithms to draw lines with pixels, to ray trace, and are now coding GPGPUs.
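
(For anyone who hasn't written one of those line drawers: below is a minimal sketch in Python, in the spirit of Bresenham's classic algorithm - nothing but integer adds and compares in the pixel loop, which is what made it fast on old hardware. The function and the put_pixel callback are made-up names for illustration.)

    def draw_line(x0, y0, x1, y1, put_pixel):
        # Walk from (x0, y0) to (x1, y1), calling put_pixel once per pixel.
        dx = abs(x1 - x0)
        dy = abs(y1 - y0)
        sx = 1 if x0 < x1 else -1  # step direction in x
        sy = 1 if y0 < y1 else -1  # step direction in y
        err = dx - dy              # running error term
        while True:
            put_pixel(x0, y0)
            if x0 == x1 and y0 == y1:
                return
            e2 = 2 * err
            if e2 > -dy:           # error says: take a horizontal step
                err -= dy
                x0 += sx
            if e2 < dx:            # error says: take a vertical step
                err += dx
                y0 += sy

    # e.g. print the pixels of a shallow line:
    draw_line(0, 0, 7, 3, lambda x, y: print((x, y)))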

The latter are liabilities, bringing in 1980s-era best practices. They're working on some legacy BASIC or COBOL system from the seventies, and are surprised they can't find a new job when that system is upgraded and they're downsized.

I've rarely seen #1 and #2 mix. They're very different crowds, and in very different types of companies.

My own experience is that I need to devote about 20% of my time to keeping up if I'm doing it on an ongoing basis, or about 1-3 months every 2-5 years if I do occasional deep dives. Basically, I dive headlong into whatever is the newest, trendiest stack and get a product out using it, learning all the deep things behind it too. That's what works for me. YMMV.



>My own experience is that I need to devote about 20% of my time to keep up, if I'm doing it on an ongoing basis, or about 1-3 months every 2-5 years

And that's the problem with software engineering vs. other white-collar careers. For example, my accountant friend is expected to be trained by his employer in the latest accounting practices and legal frameworks; he doesn't devote 1-3 months per year of his personal time to open-source accounting projects to learn the latest legal framework for fun. That would be crazy for him. Same for my friends in architecture, dentistry, and law. Their employers pay them to learn and gather the expertise needed for their future in the firm.

Whereas, as a software engineer, very few companies (at least in Germany, in my experience) will invest in their existing workforce to train them on the job for the future language/framework they plan to use. Instead, they let them go once their expertise is no longer valuable, hire someone already experienced in the needed stack, and repeat the cycle several years/decades down the road.

That's why here you're expected to transition to management as a career progression, as IC roles are not really valued at an older age unless you've dedicated your free time to coding. And I don't know about you guys, but I'd prefer to spend my free time with my kids and exercising outdoors instead of coding to make myself employable in the latest stack.


As a physician, I attend conferences, subscribe to online references, question banks, and various journals, take ongoing CME, repeat licensing exams, spend the equivalent of one workday a week reading those new materials, and try to spend a couple of hours refreshing myself on material outside of my specialty. This amounts to an extra un-reimbursed workday a week, and several thousand dollars a year.

When I worked in a place that offered CME/conference reimbursement, it covered about 1-2K a year, depending on budgeting issues. In my current place, and for all independent or small practice physicians, that comes out of your own pocket.

This does not have any career benefit whatsoever; it’s done so as to be worthy of our patients’ trust.

I wouldn’t mind if it was at least partially reimbursed though. It’s an enormous chunk of change, and not for my benefit.


That's quite a workload you've set for yourself. I honestly don't know how my doctors do even the basic stuff I see them do, like seeing patients and charting. When you have to see 4+ patients in an hour (this is too many!), it would seem to me that charting would be one of the things that ends up going out the window.

I also find it interesting that you go so far as to repeat your licensing exams. Is this common among physicians as a whole? Having known a couple of med students personally, these exams were usually seen as a hurdle to be overcome and a source of stress, but, I suppose it might get easier after a few years of practice. On a related note, I find it hard to imagine that, say, lawyers would routinely re-sit the bar exam for funsies.

Regarding CME, isn't that required to maintain licensure? Or, are you talking about courses above and beyond the minimum to keep your license?

And, BTW, I don't know who you are, where you practice, or even what your specialty is, but you sound like the kind of person I'd like to have be my doctor.


Interesting how people interpret things so differently. A doctor who says that keeping abreast of their field has no benefit to them sounds like the kind of person I'd not want as my doctor.


I agree, but I also know how chronically overworked doctors are. That gives me a bit of sympathy toward the ones who don't want to basically work an extra day a week just to avoid falling behind.


> This does not have any career benefit whatsoever; it’s done so as to be worthy of our patients’ trust.

"I do it for my patients, not for my paycheck." If you have a problem with that, then by all means, I'm sure you can find another physician that would suit you better.


They specified no career benefit. I'm almost certain they view increased trust from their patients as a benefit, and the extra confidence you gain from keeping up with the latest clinical science is hard to measure in terms of personal value but almost certainly comes out to $0 in financial terms (or negative if you value your free time).


> Is this common among physicians as a whole?

Some of it is, some of it isn't. The worst doctor you know (okay, maybe not the worst, but close) is doing at least a couple of major journals and his CME. Really, you'd be surprised, but sitting on the other side of the exam table, believe me - it's the exception that doesn't try to stay fresh. That's really not what distinguishes bad from good from great doctors - it's finding a way to integrate and retain all that knowledge so you can apply it in an unexpected clinical scenario, rather than as watercooler talk or on an exam.

I don't re-sit step exams 1/2/3, but I do a lot of ongoing question banks to refresh my boards, and I do go back to refreshing material from steps 1/2/3 all the time (which is what I meant about repeating licensing exams - I see now that phrasing was unclear). You're right that it's largely a hurdle and a stress, but that's because as a med student you're drinking from a firehose and your career depends on it. Studying it at my leisure, I can dive into things as deeply or as superficially as is interesting at the time, and the broader my knowledge gets, the more insights I ultimately glean from going back to those fundamentals. Memorizing biochem pathways when studying for boards is hell; refreshing biochem at your leisure just to better understand and retain is... well, if not pleasant, it's certainly not hell.

CME is required to keep your license, but that's not a problem I have - there are multiple sources of materials granting CME credits that I already do "for fun", so I've got an over-abundance of credits. I seek out interesting CME courses to stay abreast of interesting things. I went into medicine for love of medicine, and the idea of getting so narrow in my niche that I lose sight of all of those other exciting things would be a tragedy to me.

As much as I appreciate the praise, honestly, you'd be surprised by how much even your least-impressive physician puts into staying up to date. There's just so much to know that the moment you stop your knowledge base evaporates.


> This does not have any career benefit whatsoever; it’s done so as to be worthy of our patients’ trust.

Keeping abreast of changes in the field and the state of the art offers no benefit and is done solely for patients' trust?


Goodhart's Law may be what OP is referring to.

I have seen it in teaching. Teachers need xx hours of training per year. Training often satisfies that requirement but provides no significant benefits in terms of pedagogical improvement or content knowledge.

Teachers that want to improve do so by other means. The training keeps us in compliance.


I've known more than a few high school teachers who ended up with Masters or PhD degrees kind of by default via continuing education courses. That would seem to go against your "teachers that want to improve do so by other means" idea, unless I'm confused about the nature of continuing education requirements for teachers.


My assertion is that continuing education credits or advanced degrees are far from a guarantee of improving a teacher's practice. Continuing education suffers from "box checking." There are a number of reasons for this.

It is in no way a knock on teachers. They are caught up in a bad system and are responding to systemic incentives.

If we apply an always/sometimes/never framework to my assertion, we can find examples where teachers advanced their practice via continuing education. So the teachers you know certainly could have advanced degrees, some even very helpful in improving their practice.

My experience in K-12, as well as studying the history of education reform in America since Sputnik was launched, informs this assertion. It has been a recurring theme for 60 years.


Credentialism - I saw this word used elsewhere today, and it sums up what I am trying to say.


Correct.

We don't gain or lose patients by it; the most recent changes in the field are often so far from settled clinical practice that they're years away from anything that would be considered malpractice; we don't get reimbursed better for it; and patients largely can't tell the difference, so it doesn't change your referral stream.

It does little to nothing for our careers. We stay up to date out of pride, and out of commitment to providing our patients with good care.


Medical literature has a pretty low SNR when you look at it from the point of view of "does this help my patients?" Also, PubMed is a thing that's roughly the medical equivalent of Stack Overflow, so you can do some of this "on the fly" to an extent.


They said no career benefits. They don't usually get a promotion or a salary raise for knowing about X, so its effects are indirect unless they actually use it regularly - like, say, a dentist knowing about dental implant options.


That is a fair point and in that light the statement is far less jarring.


> This does not have any career benefit whatsoever; it’s done so as to be worthy of our patients’ trust.

That's what I said. It's surprising how many people chose to read into my post something that isn't there.


Does this surprise you? Interacting with my doctors has never given me the impression that they stay abreast of current literature, and their employment doesn't seem threatened by the deficit.


In medicine it's largely due to bribes from vendors.


Doctors around here don't do anything close to that, nor pay out of pocket. Most will take sponsored vacations paid for by drug companies as a reward for being the top prescription writer.

Same for dentists, usually paid for by some whitening product they will push over the next year.


How would you compare your earning potential to a software developer's?


Doctors on average make more, but they also start their careers burdened with overwhelming debt (a few hundred thousand dollars) and work much longer hours than software developers.

Both my parents are doctors, and I'm a software developer. And you know what? I have it really, really good in comparison.

edit: I noticed that you phrased the question as "earning potential" -- well, in that case, it's comparable. High-level engineers at FAANG make boatloads of money.


Good points. Doctors do appear to have good career longevity though (in the US). The practice my children go to has several doctors in their late sixties. The doctor who delivered my kids was close to seventy years old. So if you look at career earnings, doctors seem to be in a better spot, as they aren't worried about finding a job from ages 50-65. Is that your parents' experience with their colleagues?


In fairness, the CME/conference circuit is basically a way for physicians to legally embezzle a free vacation. Many of those conferences are held at places like Jackson Hole, the Bahamas, etc., with relatively few hours per day spent in talks and the rest spent skiing/relaxing/drinking on the hospital's dime.


My anecdotal evidence does not match your anecdotal evidence. I'm approaching 50, and I've had about a dozen jobs in tech starting in my teens. I've never worked anywhere that didn't allow on-the-job time for learning, and the majority of my employers have both actively-encouraged it and financed it. I also learn on my own, for fun, as my career started as a hobby and still interests me, but the vast majority of my education has been paid for during normal work hours. I also actively seek out new and interesting technology when switching jobs, and I switch jobs when things stagnate. It's what you have to do in tech, regardless of your age. If you stay too long at a company that isn't advancing your career, you're going to go stale. This isn't specific to tech either. How many mechanics, lab technicians, chefs, marketing people, stock brokers, architects, lawyers, etc. could find a job today if they hadn't learned anything in 20 years?


> I've never worked anywhere that didn't allow on-the-job time for learning

I think the issue is more just that there aren't any clear ethical standards that have been set industrywide, and since developers tend to have limited oversight, at this point it's really just a matter of what standards you set for yourself.

IMHO, for things you're learning that will materially benefit your career, a reasonable standard would be that for every hour you spend teaching yourself on your own time, you can spend an hour of paid time. Whereas things that only benefit your employer, e.g. niche libraries or outdated frameworks, should be learned entirely on the employer's dime.


No field reinvents itself as frequently as software.


That's only true if you consider SWE to be "learning a JS framework".

The core methods and principles of SWE are almost timeless; other things are well documented and can be learned on the go.


It's extremely difficult to communicate this to a team exclusively populated by those who do not have cross-language experience.

As an anecdote, and not to boast: I have had to write data visualization and function-plotting software at least half a dozen times in as many languages and frameworks, from QuickBasic as a teenage hobby, to C in DOS, Java (Swing and JavaFX), OpenGL, WebGL, JS with charting libraries, and the raw Linux framebuffer. I also tweaked MRTG charts back in the day, wrote a super-basic 3D editor in high school for games I never ended up making, etc.

A team I was on once had to add a simple chart to a webapp. When the task came up in meetings and was causing the dev assigned the task some grief, I mentioned that I've had to do some charting work before, and offered to help with any details if they got stuck. Instead of saying something like, "Ok, I'll let you know if I have any questions," they said, "Yeah, but was it D3?"


If by "anything" in the phrase "hadn't learned anything in 20 years," you mean things like legal principles, rather than specific new precedents and laws, I suspect a lawyer probably doesn't have to learn much after they clear the bar exam. If any lawyers read this, I'd love to be proven wrong. :)


I think you're underestimating how fast the law changes. In a lot of practice areas (I work in tax), there can be substantial changes every few years, and in between, daily work changes as everyone converges on optimal strategies, and then the law changes again. Then there's new software constantly, which you generally need to be familiar with or get left behind. Then there are the gimmicks of the day that your clients find on the internet that you need to be familiar with or risk looking like you don't keep up with best practices (even if the new stuff isn't anywhere near a best practice, or is only applicable to Fortune 500 cos.).

I would go so far as to say that a lawyer who just recently passed the bar exam is substantially worse at the practice of law than a paralegal with 20 years of experience. The legal principles learned in order to pass the bar exam are akin to... basic algorithms (maybe?) for a software engineer. They're important, but they're also not really what the job is on a daily basis.


Nobody can argue what the next accounting rules are. Accounting practices are handed down from above and all accountants have to follow it.

Programming is not like that. If you ask 100 programmers what the new big thing is, you will get 100 different answers. I would much rather take responsibility for my own professional education than outsource it to a company that may not have my career best interests at heart.


>Nobody can argue what the next accounting rules are

This is just not accurate.

Accounting rules not only can be argued, they are all the time.

FASB and GAAP are rife with subjectivity.

I have an accounting degree.


True but the framework is standard, which is why there is no software engineering equivalent to GAAP. GASP?


I think a lot of this is because employee turnover in the software industry is much higher than for most professions.

An accountant, lawyer or architect can reasonably be expected to stay with the same firm for a decade or longer, often their entire career. It makes sense under that context for employers to invest more in long-term skills.

Whereas once you get to the heart of Silicon Valley, it's not unusual for people to jump employers every 6 months. That's maybe not the rule, but even Google and Microsoft have turnover rates that imply a half-life of no more than a few years for the average employee. The economics of long-term re-training just don't make sense.

Is this the worst thing in the world, though? It allows savvy workers to continuously jump between companies and re-negotiate higher compensation packages. That helps make sure workers are paid at or near their market value, in a way that doesn't work in the accounting industry, where future employers would look down on that kind of resume history.


I managed a software development team, staffed with a number of folks that each had about 30 years experience. We were a fairly advanced team that wrote image processing pipelines in C++.

When they finally rolled up our team, after 27 years, the person with the least seniority had ten years.

It's entirely possible to keep folks for long periods of time, but it requires some pretty serious management chops, and those skills are not exactly encouraged, in today's world; in any field -not just software development.

I worked for a Japanese company. The Japanese wouldn't even acknowledge my employees until they'd been around for a year. Having the same people on hand, year after year, was pretty much required, as our projects always took the long view.

I can't even imagine having an environment where engineers are expected to quit every year and a half.


No matter how good the manager is, if he is beholden to an HR department that only gives slightly above COL raises while the market explodes, people will jump ship.


Says you.

Our HR was run by the Corporate General Counsel. It was pretty much as bad as you can get.

Also, we were paid "competitive" wages (read: below-market).

I was the manager for 25 years (I should mention that the engineer with the most seniority had 27 years - we started together), and I feel I did a pretty good job. Considering they did not "jump ship," I guess something went right, eh?

Nowadays, it seems that people "manage by fad," as opposed to managing humans. It's kind of heartbreaking, really.


Until they find out that while they were getting 3% raises, their salaries fell well behind those of the developers who came in after them and were paid at the market rate.


Wow. You really want to discount what happened. Why?

Just assume that we were all idiots, and go back to your happy place.


Yes. Anyone who is willing to work for less than market value while new people are coming in at market rates isn't an “idiot”, but they are unwise.

Did you tell them that “we are all family here”?


Did it ever cross your mind that there are more things than money which make a person stay at a company for a longer period?

For me personally, if the money is enough for me to make a comfortable life with my family, other things at work become important. I could never spend most of my days with assholes or do idiotic work even if the pay was way above market value, for instance. I'd also trade money for more free time, if possible.

And yes, the company I work at does feel like a family. But nobody had to tell me this. It just does so, naturally.


Well, I'm done engaging. It's fairly clear that we have incompatible worldviews, which is sad, because I'll bet we could find many things in common.

I sincerely wish you the best.


That logic doesn't hold up. The overhead in learning your way around a Google or Microsoft-sized codebase is much larger than learning a new language or framework.


That's no surprise. Both of those companies have many individual codebases that are larger than any web framework. For instance, last I checked, Django clocked in at around 60-70k Python LOC [0]. The Windows kernel source, IIRC, is over 1M LOC, and, obviously, much lower level than Django.

---

[0]: Also, IIRC, 20-30k of that is taken up by the ORM.


This is a good comment with a lot to unpack, but I want to raise a couple points here.

First, I wonder if firms investing in training could possibly improve turnover, thereby creating a bit of a positive feedback cycle. It doesn't even have to be formal training, either. It could be something as simple as having a weekly journal club, or the equivalent, and encouraging engineers to read at least one research paper a month. [0]

The second aspect, engineers moving jobs just to get raises, seems weird to me from a market efficiency point of view. Interviewing costs companies money -- so much so that it's something they should want to do as little as possible.

Many companies don't keep pace with the open market in terms of raises, which is a primary force driving people to job hop. Are there any studies comparing companies that do at least attempt to keep comp for current employees in line with the open market against those who don't?

---

[0]: In my experience, reading research papers thoroughly can be a pretty thought intensive process. In grad school, where I studied math, what I would do is read the abstract, decide if it was interesting, then skim the section headings and statements of theorems to see if I wanted to go further. If I did, and I was searching for a particular widget I needed for a proof, then I would read as much as I needed to read to digest the proofs of the useful theorems. If it was for general interest, then I would read the whole thing. I found that once I got an interesting paper in my hands, fully comprehending what it said could take up to 1 day per page for particularly dense papers.


> Whereas, once you get to the heart of Silicon Valley, it's not unusual for people to jump employers every 6 months.

Unless you're serially hopping from one shitty startup to another, it's pretty unusual.


> because employee turnover in the software industry is much higher than for most professions

I wonder if that's actually true, and if so, to which degree. On what basis are you saying this?


> Their employers pay them to learn and gather the expertise needed for their future in the firm.

I think it very much depends on the company and culture. I always had jobs where learning and self-improvement were encouraged and expected (also in Germany), with a budget for conferences and books and a fixed time frame (~20%) for that. These were all companies that primarily did software development -- either direct product development or project work for customers. On the other hand, you have firms where software is treated as an appendage. They might have other great products but an entirely different managerial background. A mindset like "We need a software department, everyone else has one too" can easily lead to mismanagement - and I think a lack of time and budget for self-improvement is an aspect of mismanagement in the business of creating software.


Ah, training - I remember that. Sadly I was only ever "treated" to a couple of "real" (paid-for) courses at the start of my career and the rest was on-the-job.


I've also heard of a lot of employers that are pushing the 20% policy for their software engineers or providing some other time allocated towards upskilling.

My employer requires us to do one hour of training per day. We are allowed to study whatever we want or work on personal projects. I personally don't think it's weird for an engineer to keep up with emerging tech and trends. I'm sure a lot of engineers outside of software do this.


I have a slightly different problem. My employer will pay to help us keep our skills up. We can take training classes and go to 1 related conference per year. We have some time during regular development to work on innovative projects that we'd like to implement or experiment with.

That's all great, but much of it ends up not being applicable. For example, we were trained in a new language about 3 years ago, but we haven't been allowed to use it in our product yet! I've been doing my home projects in it, so I'm ready to go when we do start using it. But most of my coworkers took the class and haven't touched it since. They've likely forgotten everything about it. Likewise, the conferences are nice, but I've never implemented anything useful after having read a paper about it, or seen a presentation on it. (The few times I've tried, it turned out the paper didn't give enough information to do your own implementation!) It does keep me aware of what's going on in the field, but I'm not sure how useful it actually is to my job.


You definitely have to push for it; only in the best companies will management from top to bottom take care of this proactively. Tell your manager regularly that you need to expand your skills, explain why, and show him what you want to do and how he can help. Some managers are not very good at this, and you'll need to do the majority of the work. Accept it and do your part, but don't think training isn't needed just because your manager doesn't bring it up.


> Whereas, as a software engineer, very few companies(at least in Germany from my experience) will invest into their existing workforce to train them on the job

Maybe I've been fortunate, but in the UK and for a couple of years in Australia, I have had employers (and later clients for my contracting business) who have been happy to throw me at projects far enough outside my comfort zone that I keep learning and stretching my muscles. I feel at the top of my game (much of the time).


> ... (in Germany) ... instead seek to let them go once their expertise is no longer valuable and hire someone already experienced in the needed stack then repeat

Interesting. I thought Germany's labor law highly discourages this. Isn't "It is easier to divorce than to fire someone" a German saying about your tough labor law?


Note that these other professions have tests and certifications you need to pass. I can see developers howling in anger if they were required to do this.


I don't think it makes sense to have this in the form of required licensure to practice, but I certainly wouldn't mind if there were tests and certifications I could take that would allow me to show prospective employers what I could do. Extra bonus points if having those things on my resume allowed me to avoid the types of interviews where the candidate is essentially put on the spot for an entire day in front of a whiteboard or laptop. To their credit, I believe that's a direction TripleByte might be attempting to go in, but I don't pretend to be able to speak for them.


How often do legal frameworks change?

As a non-expert, I think there's a 50% chance it costs 10x more to teach each new hire for a month.


> They're working on some legacy BASIC or COBAL system from the seventies, and surprised they can't find a new job when that's upgraded and they're downsized.

Wow, exaggerate much?

This doesn't describe the vast majority of "older" workers, it's just another disappointing stereotype and expression of ageism.


The problem is that there is an undue burden put on those in group 1 to prove they are not group 2.

Imagine an engineer walks into an interview. If it's a young person, you think nothing of it and go on with your normal interview. Now a different engineer walks in with grey hair and some wrinkles. You feel the need to dig into whether they're in group 1 or group 2, in addition to your normal interview.

I'm not saying you're wrong, but if we were talking about how there are two groups of women and one of them is a liability a lot more people would be setting off alarms.


Given the process of interviewing, I think that burden falls, to a degree, on anybody going through it - young or old; male, female, or non-binary; no matter their color or creed. A normal interview should dig into the qualities of 1 and 2 - although in practice many interviews apply downright insane, power-tripping prejudices that a stranger couldn't possibly know about, like filtering on whether candidates follow arcane, arbitrary, and archaic fashion "rules", or whether they write a thank-you letter afterwards.


People do think that about women, but it usually goes “Is this the kind of family-centered woman that will be on perpetual maternity leave and not pull her weight?” It’s very hard to do anything about it even if we acknowledge that it happens.


>Ones who keep up their skills

To get a rough idea of the value of up-to-date skills vs apparent age, let's consider a hypothetical:

If both a 50 and 24 year-old graduate from the same coding boot-camp, do they have equal odds of being seen after the first interview? (let alone actual employment)

And how big is the difference in probabilities?

I also note that desktop games are primarily coded in C++/C. If hiring were skill-based, we would expect that industry to be zealously recruiting older engineers.


Sadly, from experience, HR will blindly screen them out and trot out the usual "overqualified" excuse not to progress them - which they regularly do and get away with.

I had a situation exactly like that once. I bypassed HR and went straight to the manager, who lambasted HR; I got the interview and the job, but still got shafted by HR, who messed up my salary, lied about a raise, and generally made my life hell with pettiness and bullying, for want of another way of putting it. On top of that, the one person in HR who took my side and told me what her manager was up to was gone soon after. So yeah - HR causes many of these ageist issues in situations like the one you outline.


What I've heard about game development (correct me if you disagree) is that it is typically more demanding or lower-paying than gigs in other fields, and that it's only worth being in that industry if you're passionate about it. I think older engineers will trend towards less demanding or higher-paying industries, and are less likely to be as passionate about gaming.


Boot camps aren't "up-to-date skills", but to the extent that 24-year-olds are getting hired off them, that would still support your point.

Why do you assume young people don't learn C++ to work on games? They do to work on web servers.

The AAA in AAA games refers to media content, not the game engine. Game programmers are a trivial fraction of the C/C++ industry.


24-year-olds aren't getting Senior or Mid-Level Dev salaries. They know the new hotness and are cheap -- that's good enough.


> They're very different crowds, and in very different types of companies.

This is not a fact. I've seen both of your types in the same company. Let's not reduce people down to "you're either this, or this". It's not a good way of thinking. People are more complex than that and have different things to offer.


I am fascinated when the #2 group pops up to say there is age discrimination in coding - when you dig into it, the problem is about skills and not age (although I cannot really speak to ageism in the tech sector).

Every other professional career requires you to stay up-to-date in your professional skills and knowledge to stay relevant in your field. Why would coding/tech in general be any different?


Just because there are older folks struggling to find jobs after not keeping up skills doesn't mean there isn't also ageism, and it can become problematic when everyone disregards the ageism by saying it's all just meritocratic and skills based.

I think there are lots of younger developers in the hiring process who start out with a bias of assuming older developers have atrophied skills. Then when that bias makes it harder for older developers to find jobs, the younger developers say "ah, well that's meritocracy for you".


Beautifully said. Ageism is real.


This isn't true at all. Think of other professionals - lawyers, doctors, professors, etc.

A doctor specializes in one aspect - surgeon, anesthesiologist, ER, GP, psychiatry, etc. They might pivot once in their career, but most of them don't, and they're able to find employment as long as they're able and willing.

I know several lawyers, and most of them had to specialize by their early 30's if they ever want to make decent money - family law, real estate, employment, personal injury, whatever. Again, the older the lawyer, the more seniority they have and higher they can bill at most firms.

How many professors in academia do you know that have experience teaching in multiple schools, e.g. business, engineering, social sciences, etc? Not many, usually they have a very narrow niche.

The problem with the tech industry is mostly due to offshoring, the rapid (and pointless) pace of new frameworks and tech that's mostly due to shifting dominant players, and the naivety of most software engineers who've been unwilling or unable to organize and create some sort of protective barrier similar to every other industry (teachers or cop unions, AMA, legal bar, UAW, etc.).

And again, because this is HN, the majority of developers are not worried that they won't be making FAANG salaries with sweet equity and stock options into their 50's. They're worried that they'll be training their 25 year old replacements from Bangalore at the typical mega bank or insurance company, left with only sporadic temp gigs and 6 month contracts at half their salary and with 15 years left before Medicare kicks in.


You can do pretty well as a developer if all you know is, e.g., COBOL or low-level C.

Do keep in mind that the industry has exploded in size over the years.


I'm pushing 40 and primarily a Java developer. I'd be surprised if I can't ride this wave all the way to retirement.


Python seems to be headed this way, as well. People tend to forget that Python is almost 30 years old already. That it's held up this long, and that it's still being developed and maintained strongly suggests it will continue to be a viable language in the industry for many more years.


There is still plenty of new development going on in Java, and I hope that continues. But I'd be afraid that if Java is all you know, you're going to increasingly be stuck on critical legacy JEE / Spring apps at banks, insurance companies, etc. Right now that's okay - there's still a lot of innovation in these frameworks. But in 10-15 years, it might be the worst kind of gig left, stuck with offshored and contracting teams of the lowest bidder.


If Cobol is anything to go by, at least pay won't be much of an issue.


>Every other professional career requires you to stay up-to-date in your professional skills and knowledge to stay relevant in your field.

Is this really the case? My limited experience is that the amount of constant learning expected from a coder is an order of magnitude more than in most other fields.


I work in education - if I were to sit on my hands and rely on nothing more than the most basic foundational research and 'best practice' from when I left my post-doc program 10 years ago, I would be unemployed. I work on professional development and skill building constantly.

That's not the exception to the rule, either. You're expected to stay current on trends in your field, and stay ahead of best practices.


#2 describes a trend in every sector and business: there are a lot of people who find it harder to get work as they get older because the value of a skill set/knowledge base evaporates. From coal mining to payroll.


Definitely agree - at my last employer, there were 3 of us in our 40s. Their skill set could best be summarised as "the year 2005": Subversion, MFC, and C++03.


I suspect that in 15 years people will be saying the same about k8s, and possibly JS if Wasm takes off; and I suspect ML will be a fairly niche area.


Is "subversion" a skill (unless we're talking about spies ;)? It's just a tool. I've used it. I've used cvs, git and other stuff too. If I had for some reason to work in a company that uses some other tool to handle their code repos, I'd learn it in a month or so (less if it's organized logically). It's like saying "driving a Honda Civic" is a skill - driving is a skill, Honda Civic may be what you're driving right now, and next week you may need to drive a BMW, and being able to do both is what I'd call a skill.


So? Subversion works fine; nothing wrong with it. Are you now a bad developer if you focus on actually getting stuff done instead of spending time migrating to a new tool?


Considering the company is down to its last couple of developers and can't attract new ones because of the old technology stack, it's definitely a problem.


I guess, but those tools alone build powerful fast software, and those programmers are solving problems which translate well to any field requiring problem solving. The software world may change rapidly, but a React / Git stack is no different than a C++ / Subversion stack when both jobs require programmers to solve hard, complex problems.


I see a lot of 40/50-year-olds doing Java/Python/C++ development with CI/CD and unit-testing skills who are handy programmers without knowing much about JS/ML/k8s. To me that puts them right in between your #1 and #2.


As a programmer in that age range I know I don't need to spend time learning the nuances of JS if I'm not using JS. If I start on a JS project (god forbid) I can learn that then. Just like I learned C/C++/Java/Python/PHP/Ruby/Clojure etc when I started on one of those projects.

Machine learning - we know that just means linear regression or Bayesian filters plus marketing, and we prefer programming. We've also seen 20 years of "magic bullet" solutions like ML fizzle and die in the real world and know most ML projects never see a day in production.

K8s is great if running k8s is your job. But it is a specialized skill that is only needed to run very large infrastructures, unless your project architecture (read microservices) has gone very wrong.

20/30-year-olds think "keeping your skills up" means learning every new programming fad that blows through, because they don't have the experience to differentiate the fads from the advancements. It's like telling master carpenters they need to keep switching brands of tools to "keep their skills up". But all these tools do the same things and are 99% identical. They are busy building stuff with the tools they have.


I love the carpentry example, it's perfect! Sometimes I think tech hiring is kind of like hiring a carpenter, and asking him, "Do you use a circular saw? Because we're a circular saw shop. And I don't mean your 1952 Black and Decker saw, I'm talking a modern day Makita. If you don't have modern Makita circular saw experience, you need to work on updating your skills!"


The older I get, the less interested I am in learning new things for the sake of it. I've already programmed with a whole bunch of different environments in the last 20 years; what's the value of learning something new if it doesn't give me any benefits?

It's not being lazy or a "dinosaur", it's just better time management.

The absolute worst devs I've worked with were very "up to date" people who wanted to re-do all the already-working "outdated" stuff to new "modern" standards. Often, it's just a waste of time.


Very much this. Knowing the particular set of keywords that constitutes the latest fashionable language is a short-term skill. Knowing the paradigms that guide all of them, and being able to learn the keywords in a short time if necessary, is a long-term skill. Some companies prefer to "hire to the spec" to save short-term learning costs. Smarter companies look for long-term skills, which will be useful whatever keywords are in fashion this season.


ML includes logistic regression these days too!

Revolutionary advance in industry best practices. /s


From your comment I get the feeling that you think Java/Python/C++ is some obscure legacy tech along the lines of BASIC/COBOL.


It also implies that machine learning and devops are the norm for software development.


Which is typically a symptom of having worked at a start-up vs. an established company.

During the dot-com crash, I went from working at a 40-person start-up to a 25,000-employee utility company, and it was a real eye-opener. A lot of my "cutting edge" (for the time) skills were dismissed as being flash in the pan, and all the "real work" was done with tried-and-true technologies. I ended up finding my way back to a start-up a few years later, and everything was reversed again.


That says "startup". ML is the buzzword that investors love, and k8s/devops allows avoiding big investment into infrastructure which may need to be dropped anyway when it turns out the market doesn't actually want yet another "apply ML to click stream to save on ads costs" startup (I'm stereotyping of course but you get the idea).


Makes me wonder how old the previous commenter is, too, if they have that view.

Ageism is rampant in this industry.


It’s actually somewhat ironic considering how much SV pushes for diversity and inclusion... unless you’re over 40.


I'm 42, and until recently I worked for a well-known tech company in San Francisco. Most of the time I didn't feel like I was the only person over 30 in the room, but only after moving to another city and starting to work on a truly age-diverse team did I realize how unbalanced my previous team was.

My current team has a good mix of industry experience and excitement for new technologies, which makes planning both effective and exciting.


Part of the problem is that SF is just so damn expensive... it's going to self-select for people that can afford to live without the additional burden of a family and that tends to be people <30.


Some parts of SV push for diversity and inclusion, but when push comes to shove, firms are quite happy to protect established power structures - shitty managers, retaliatory practices, toxic culture, etc.


Seems to me they place them (IMO appropriately) somewhere between legacy and trendy.


1) was "keeping up with their skills", not "trendy". "Keeping up with your skills" does not mean "know JS/ML/k8s" for a wide range of developers.


No, I am one of the people I just described. I don't know where I fit in with the OP. I'm struggling to keep up with the avalanche of new stuff coming along.


As a veteran dev in my 50's, "k8s" gives me the screaming heebie-jeebies. I've just about got my head around Docker. But it's painful seeing a system that should be a nice little monolith serving a few thousand requests an hour split up into microservices and "managed" using k8s for no good reason.

I realise this might make me unemployable in a modern web dev environment. Maybe I can just ride it out until the industry goes through the rest of the cycle and rediscovers simplicity.


The k8s basics are pretty simple, actually. If you know Docker, k8s is basically a way to keep a bunch of Docker containers running according to a bunch of YAML configs. There are all kinds of fine details, but the gist of the thing is just that. Of course, like every tool, it's not always used properly.
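
For a concrete picture, here is a minimal sketch of the kind of YAML config meant here - a hypothetical Deployment asking k8s to keep three replicas of a container running (the names and image are made up for illustration):

    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: web                # hypothetical service name
    spec:
      replicas: 3              # k8s restarts/reschedules containers to keep 3 alive
      selector:
        matchLabels:
          app: web
      template:
        metadata:
          labels:
            app: web
        spec:
          containers:
          - name: web
            image: nginx:1.25  # any Docker image
            ports:
            - containerPort: 80

Feed that to kubectl apply -f and the control loop keeps reconciling the cluster toward the declared state - that really is the gist.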


It's not the complexity that concerns me. I have dealt with more complex things ;)

It's why people use it in the first place. I get the need for it when you're dealing with huge scale. But it seems to be the new default deployment model, for services that really, and I mean really, don't need to scale that much.

And I've seen people justify breaking a nice monolith into microservices (usually badly) so they can deploy it easier using k8s. Which is totally putting the cart before the horse.


> It's why people use it in the first place.

Easier to run a small service in a predictable environment where nobody can step on your toes. It's also pretty easy to adjust resource allocations, update pieces independently, and isolate screwups (one part going down sometimes is better than the whole thing going down), etc.

I mean, of course you can't approach the task as "we want to deploy on k8s first, no matter what" - you have to consider the task at hand, and if it works better as a monolith, keep the monolith (you can still use k8s - it's just a way to run code, and any code can be run within it). But if the task suits the model - e.g. many data processing/transformation/ML workflows do - having a tool like k8s can make deployment easier. One doesn't have to make a religion out of it, but if your problem looks like a bunch of processes working on separate tasks, it may be a useful tool to manage them.

Whether it'd scale better, performance-wise, is a very tricky question, which depends a lot on the task at hand. I think flexibility is a more important aspect. If your task is monolithic and all you need is raw power then maybe k8s isn't the answer.


> Whether it'd scale better, performance-wise, is a very tricky question, which depends a lot on the task at hand. I think flexibility is a more important aspect. If your task is monolithic and all you need is raw power then maybe k8s isn't the answer.

For most (80%+) of the applications I've seen k8s used on, the performance question is not tricky at all. Monolithic performance would definitely be orders of magnitude greater.

I can't help but draw the conclusion that people are using k8s because it looks good on their CV. Whether I'm wise in being skeptical about k8s at my age is a good question.


I can't say much about deployments I haven't seen, but I am using k8s at my current job, and where we use it, it works quite well and makes deployment easier. I can't give many details, but it's basically processing a bunch of data in real time in a bunch of ways, organizing it in a certain manner, and serving a certain set of queries from the result. Before anybody asks: no, it's not ads, and not clickstreams or anything like that :) And deploying with k8s seems to work decently for that.

Moreover, I can see quite a few places at my last job (which didn't use k8s) where introducing k8s deployment could have helped. That said, "one size fits all" is never a good strategy. But I think saying "people are using k8s because it looks good on their CV" is wrong. It has its uses; just don't put it where it doesn't belong.


There's a whole lot of current, modern programming that doesn't involve JS, ML, or k8s. Heck, I'm a young programmer and I've done marginal amounts of ML work, and I avoid k8s beyond a high-level understanding of what it is.


As an older programmer I have been actively avoiding JS/ML/k8s.

JS is simply garbage we are stuck with, where you have to learn all of this year's footguns to avoid creating bad code. ML is a buzzword of limited scope. k8s is system administration by another name. Web and mobile technologies are useless to learn unless you need them RIGHT NOW, as they have a half-life of 18 months.

I want to learn the "force multiplier" sitting beyond what we are using today. GC languages were the last round of force multiplier, and we haven't had much since.

Right now, the only candidate that looks to be a force multiplier is Rust, but I would jump to something else that looked like a force multiplier.

ML is a "force multiplier", but it has limited scope. It might be worth learning depending upon what field I'm sitting in.


If you mentioned Rust, Go probably goes in the same bucket - seeing a lot of it lately.


Go is just another "managed language" with some oddities (somewhat better support for concurrency, which is useful for servers, and some programming-in-the-large improvements).

Dart goes in a similar bucket even with the native compilation.

These languages are all effectively Java with some makeup.

I'm not seeing much force multiplication. I see no new language allowing me to write something more than what I can write now.

It remains to be seen whether Rust will wind up as a force multiplier or not. But it's really the only current candidate even if it's not a great one.


I know quite a few game and embedded developers of all ages from 20s to 60s - none of JS/ML/k8s seems to be particularly relevant to anything they do.


ML the language or Machine Learning?


Haha! Given they listed k8s and JS together with ML, clearly it must be something "trendy", i.e. machine learning. I love how acronyms can be read completely differently depending on where you're coming from.


That's just your wishful thinking. Here on HN I've seen both people who stayed up to date complain about discrimination, and interviewers say that they expect older developers to bring more to the table - i.e. a decent older developer will be disadvantaged when competing against a decent younger developer who has learning potential.


Yeah, the issue with age discrimination as an older developer is not that you are entitled to some credit for years of experience when, say, you haven't been using Java since college; it's that you can't be hired on the same basis and salary as a new grad who has no experience either, despite having demonstrated many times in the past your ability to learn.

It's easy to stereotype people who have unrealistic expectations based on entitlement because of their age, but past a certain point, a lot of employers will reject an older candidate, even at the same price as a new one, on the assumption that they can't learn any more.

Of course, if you are employable, you don't bang your head against the wall, you go and do something else. Like any kind of discrimination, if they were forced to accept you, it wouldn't make the culture palatable. It reduces your opportunities though.


I agree in general, but "skills" are not the only valuable commodity. Real experience matters, and I personally feel a lot of so-called experienced veterans have more like "1 year of experience 10 times" instead of "10 years worth of experience". So that is another challenge with people who have been in the industry for a while.


Can you provide metrics for the number of people you've seen in each of these two categories? Also I'm curious about what age demographic you're in and how long you've been in the industry. Hoping that I don't sound combative asking this, I'm just really interested in your perspective.


I won't go into personal specifics, but I'm mid-career. I've been through a few companies, and made it as high up as C-suite in a smaller company, and director-level in a bigger company, before finding I prefer senior IC, tech lead, advanced development, or research roles.

Of the companies I've been to, most were highly-tech focused (and pretty elite teams), but one was a large, distinctly non-tech company (with mainframes, even, and not the modern types, running legacy algorithms). I ran a small skunkworks team there.

I can't give great statistics, aside from saying that at the tech companies, people tended to fall into the category of older is better. Senior engineers were senior.

At the non-tech company, people tended to fall into the category of older is obsolete. The tech team was there for the paycheck. They worked 9-to-5, maintained a healthy work-life balance, and kept fairly boring systems running. The work they did was really outside of the company's core competency. It just needed to be done, and someone needed to do it. If layoffs ever came, I'm not quite sure who would hire them, though. It was culture shock for me. It wasn't specific to this one company either, but to the industry (we had pretty close contacts with both collaborators and competitors). This company was industry-leading, actually.


> I've rarely seen #1 and #2 mix. They're very different crowds, and in very different types of companies.

This has been my experience, too. Generally, the closer to doing technology as the main offering -- e.g. network engineers at an ISP, or developers at a code shop -- the better and more technical they are.

Once you get into the Enterprise, where they're handling specific apps and frameworks, they tend to get stuck in patterns and wither.

That's not to say you can't be an old grognard doing COBOL at a niche code shop, but when the tech itself is the offering you find much more technically competent seniors.


All too often, "keep up their skills" is code for "learn what I think is important", and says more about the speaker's (in)ability to know what's important than about actual skill. That's how ageism creeps in - not intentionally, but through lack of awareness of one's own bias.


I agree with this; the nuance, though, is that many interviewers are convinced they filter out the #2s with their pet CS quiz questions, with no regard for experience or what the company actually needs. Not talking FAANGs - this is smaller companies reaching out to recruit the senior devs they supposedly need.


I can agree with the methods described in the latter part of this post. There is no right or wrong, but as I began trusting the process of simply deep-diving for deep-diving's sake, I found my core muscles maintained. Being able to grok and push in a short time is a valuable asset.


Any thoughts on how to identify companies where people from group #1 work? I'm thinking of this from both a mentorship and a future employability aspect.




