The former are a treasure trove of knowledge and skills, and provide substantially more value than anyone junior ever could. Going through so many computing eras gives them a higher-level way of thinking about abstraction and understanding computer architectures. They've hand-tweaked assembly, C, and Java, and when they're now doing JavaScript or Python, they understand all the layers down to the metal underneath. They've gone through flow charts, structured, functional, and object-oriented programming, and all the variants thereof. They've written high-speed algorithms to draw lines with pixels, to ray trace, and are now coding GPGPUs.
The latter are liabilities, bringing in 1980s-era best practices. They're working on some legacy BASIC or COBOL system from the seventies, and are surprised they can't find a new job when that system is upgraded and they're downsized.
I've rarely seen #1 and #2 mix. They're very different crowds, and in very different types of companies.
My own experience is that I need to devote about 20% of my time to keep up, if I'm doing it on an ongoing basis, or about 1-3 months every 2-5 years if I do occasional deep dives. Basically, I dive headlong into whatever the newest, trendiest stack is and get a product out using it, deeply learning everything behind it too. That's what works for me. YMMV.
>My own experience is that I need to devote about 20% of my time to keep up, if I'm doing it on an ongoing basis, or about 1-3 months every 2-5 years
And that's the problem with software engineering vs. other white-collar careers. For example, my accountant friend is expected to be trained by his employer in the latest accounting practices and legal frameworks; he doesn't devote 1-3 months a year of his personal time to open source accounting projects to learn the latest legal framework for fun. That would be crazy for him. Same for my friends in architecture, dentistry and law. Their employers pay them to learn and gather the expertise needed for their future in the firm.
Whereas, as a software engineer, very few companies (at least in Germany, from my experience) will invest in their existing workforce to train them on the job for the future language/framework they plan to use, and instead seek to let them go once their expertise is no longer valuable and hire someone already experienced in the needed stack, then repeat the cycle several years or decades down the road.
That's why, here, you're expected to transition to management as a career progression; IC roles aren't really valued at an older age unless you've dedicated your free time to coding. And I don't know about you guys, but I'd prefer to spend my free time with my kids and exercising outdoors instead of coding to keep myself employable in the latest stack.
As a physician, I attend conferences, subscribe to online references, question banks, and various journals, take ongoing CME, repeat licensing exams, and spend the equivalent of one workday a week reading those new materials, plus a couple of hours refreshing myself on material outside of my specialty. This amounts to an extra unreimbursed workday a week, and several thousand dollars a year.
When I worked in a place that offered CME/conference reimbursement, it covered about $1-2K a year, depending on budgeting issues. In my current place, and for all independent or small-practice physicians, that comes out of your own pocket.
This does not have any career benefit whatsoever; it’s done so as to be worthy of our patients’ trust.
I wouldn’t mind if it was at least partially reimbursed though. It’s an enormous chunk of change, and not for my benefit.
That's quite a workload you've set for yourself. I honestly don't know how my doctors do even the basic stuff I see them do, like seeing patients and charting. When you have to see 4+ patients in an hour (this is too many!), it would seem to me that charting would be one of the things that ends up going out the window.
I also find it interesting that you go so far as to repeat your licensing exams. Is this common among physicians as a whole? Having known a couple of med students personally, these exams were usually seen as a hurdle to be overcome and a source of stress, but, I suppose it might get easier after a few years of practice. On a related note, I find it hard to imagine that, say, lawyers would routinely re-sit the bar exam for funsies.
Regarding CME, isn't that required to maintain licensure? Or, are you talking about courses above and beyond the minimum to keep your license?
And, BTW, I don't know who you are, where you practice, or even what your specialty is, but you sound like the kind of person I'd like to have be my doctor.
Interesting how people interpret things so differently. A doctor who says that keeping abreast of their field has no benefit to them sounds like the kind of person I'd not want as my doctor.
I agree, but I also know how chronically overworked doctors are. That gives me a bit of sympathy toward the ones who don't want to basically work an extra day a week just to avoid falling behind.
> This does not have any career benefit whatsoever; it’s done so as to be worthy of our patients’ trust.
"I do it for my patients, not for my paycheck." If you have a problem with that, then by all means, I'm sure you can find another physician that would suit you better.
They specified no career benefit. I'm almost certain they view increased trust from their patients as a benefit, and the extra confidence you gain from keeping up with the latest clinical science is hard to measure in terms of personal value but almost certainly comes out to $0 in financial terms (or negative if you value your free time).
Some of it is, some of it isn't. The worst doctor you know (okay, maybe not the worst, but close) is doing at least a couple of major journals and his CME. Really, you'd be surprised, but sitting on the other side of the exam table, believe me - it's the exception that doesn't try to stay fresh. That's really not what distinguishes bad from good from great doctors - it's finding a way to integrate and retain all that knowledge so you can apply it in an unexpected clinical scenario, rather than as watercooler talk or on an exam.
I don't re-sit step exams 1/2/3, but I do a lot of ongoing question banks to refresh my boards, and I do go back to refreshing material from step 1/2/3 all the time (which is what I meant about repeating licensing exams; I see now that phrasing was unclear). You're right that it's largely a hurdle and a stress, but that's because as a med student you're drinking from a firehose and your career depends on it. Studying it at my leisure, I can dive into things as deeply or as superficially as is interesting at the time, and the broader my knowledge gets, the more insights I ultimately glean from going back to those fundamentals. Memorizing biochem pathways when studying for boards is hell; refreshing biochem at your leisure just to better understand and retain is... well, if not pleasant, it's certainly not hell.
CME is required to keep your license, but that's not a problem that I have - there are multiple sources of materials that grant CME credits that I already do "for fun", so I've got an over-abundance of credits. I hunt-and-seek interesting CME courses to stay abreast of interesting things. I went into medicine for love of medicine, and the idea of getting so narrow into my niche that I lose sight of all of those other exciting things would be a tragedy to me.
As much as I appreciate the praise, honestly, you'd be surprised by how much even your least-impressive physician puts into staying up to date. There's just so much to know that the moment you stop your knowledge base evaporates.
I have seen it in teaching. Teachers need xx hours of training per year. Training often satisfies that requirement but provides no significant benefits in terms of pedagogical improvement or content knowledge.
Teachers that want to improve do so by other means. The training keeps us in compliance.
I've known more than a few high school teachers who ended up with Masters or PhD degrees kind of by default via continuing education courses. That would seem to go against your "teachers that want to improve do so by other means" idea, unless I'm confused about the nature of continuing education requirements for teachers.
My assertion is that continuing education credits, or advanced degrees are far from a guarantee of improving a teacher's practice. Continuing education suffers from "box checking." There are a number of reasons for this.
It is in no way a knock on teachers. They are caught up in a bad system and are responding to systemic incentives.
If we apply an always/sometimes/never framework to my assertion, we can find examples where teachers advanced their practice via continuing education. So the teachers you know certainly could have advanced degrees, some even very helpful in improving their practice.
My experience in K-12, as well as studying the history of education reform in America since Sputnik was launched, informs this assertion. It has been a recurring theme for 60 years.
We don't gain or lose patients by it; the most recent changes in the field are often so far from settled clinical practice that they're years from anything that would be considered malpractice; we don't get reimbursed any better for it; and patients largely can't tell the difference, so it doesn't change your referral stream.
It does little-to-nothing for our careers. We stay up out of pride, and out of commitment for providing our patients with good care.
Medical literature has a pretty low SNR when you look at it from the point of view of "does this help my patients?" Also, PubMed is a thing that's roughly the medical equivalent of Stack Overflow, so you can do some of this "on the fly" to an extent.
They said no career benefits. They don't usually get a promotion or a salary raise for knowing about X, so its effects are indirect unless they actually use it regularly, like, say, a dentist knowing about dental implant options.
Does this surprise you? Interacting with my doctors has never given me the impression that they stay abreast of current literature, and their employment doesn't seem threatened by the deficit.
Doctors around here don't do anything close to that, let alone pay out of pocket. Most will take sponsored vacations paid for by drug companies as a reward for being the top prescriber.
Same for dentists, usually paid for by some whitening product they'll push over the next year.
Doctors on average make more, but they also start their careers burdened with overwhelming debt (a few hundred thousand dollars) and work much longer hours than software developers.
Both my parents are doctors, and I'm a software developer. And you know what? I have it really, really good in comparison.
edit: I noticed that you phrased the question as "earnings potential" -- well, in that case, it's comparable. High-level engineers at FAANG make boatloads of money.
Good points. Doctors do appear to have good career longevity though (in the US). The practice my children go to has several doctors in their late sixties. The doctor who delivered my kids was close to seventy years old. So if you look at career earnings, doctors seem to be in a better spot, as they aren't worried about finding a job from ages 50-65. Is that your parents' experience with their colleagues?
In fairness, the CME/conference circuit is basically a way for physicians to legally embezzle a free vacation. Many of those conferences are held at places like Jackson Hole, the Bahamas, etc., with relatively few hours per day spent in talks and the rest spent skiing/relaxing/drinking on the hospital's dime.
My anecdotal evidence does not match your anecdotal evidence. I'm approaching 50, and I've had about a dozen jobs in tech starting in my teens. I've never worked anywhere that didn't allow on-the-job time for learning, and the majority of my employers have both actively-encouraged it and financed it. I also learn on my own, for fun, as my career started as a hobby and still interests me, but the vast majority of my education has been paid for during normal work hours. I also actively seek out new and interesting technology when switching jobs, and I switch jobs when things stagnate. It's what you have to do in tech, regardless of your age. If you stay too long at a company that isn't advancing your career, you're going to go stale. This isn't specific to tech either. How many mechanics, lab technicians, chefs, marketing people, stock brokers, architects, lawyers, etc. could find a job today if they hadn't learned anything in 20 years?
> I've never worked anywhere that didn't allow on-the-job time for learning
I think the issue is more just that there aren't any clear ethical standards that have been set industrywide, and since developers tend to have limited oversight, at this point it's really just a matter of what standards you set for yourself.
IMHO, for things you're learning that will materially benefit your career, a reasonable standard would be that for every hour you spend on your own time teaching yourself that thing, you can spend an hour of paid time. Whereas things that only benefit your employer, e.g. niche libraries or outdated frameworks, should happen entirely on the employer's dime.
It's extremely difficult to communicate this to a team exclusively populated by those who do not have cross-language experience.
As an anecdote, and not to boast: I've had to write data visualization and function plotting software at least half a dozen times in as many languages and frameworks, from QuickBasic as a teenage hobby, to C in DOS, Java (Swing and JavaFX), OpenGL, WebGL, JS with charting libraries, and the raw Linux framebuffer... I also tweaked MRTG charts back in the day, wrote a super basic 3D editor in high school for games I never ended up making, etc.
A team I was on once had to add a simple chart to a webapp. When the task came up in meetings and was causing the dev assigned the task some grief, I mentioned that I've had to do some charting work before, and offered to help with any details if they got stuck. Instead of saying something like, "Ok, I'll let you know if I have any questions," they said, "Yeah, but was it D3?"
If by "anything" in the phrase "hadn't learned anything in 20 years," you mean things like legal principles, rather than specific new precedents and laws, I suspect a lawyer probably doesn't have to learn much after they clear the bar exam. If any lawyers read this, I'd love to be proven wrong. :)
I think you're underestimating how fast the law changes. In a lot of practice areas (I work in tax), there can be substantial changes every few years, and in between, daily work changes as everyone converges on optimal strategies, and then the law changes again. Then there's new software constantly, which you generally need to be familiar with or get left behind. Then there are the gimmicks of the day that your clients find on the internet that you need to be familiar with or risk looking like you don't keep up with best practices (even if the new stuff isn't anywhere near a best practice, or is only applicable to Fortune 500 cos.).
I would go so far as to say that a lawyer who just recently passed the bar exam is substantially worse at the practice of law than a paralegal with 20 years of experience. The legal principles learned in order to pass the bar exam are akin to... basic algorithms (maybe?) for a software engineer. They're important, but they're also not really what the job is on a daily basis.
Nobody can argue about what the next accounting rules are. Accounting practices are handed down from above, and all accountants have to follow them.
Programming is not like that. If you ask 100 programmers what the new big thing is, you will get 100 different answers. I would much rather take responsibility for my own professional education than outsource it to a company that may not have my career best interests at heart.
I think a lot of this is because employee turnover in the software industry is much higher than for most professions.
An accountant, lawyer or architect can reasonably be expected to stay with the same firm for a decade or longer, often their entire career. It makes sense under that context for employers to invest more in long-term skills.
Whereas, once you get to the heart of Silicon Valley, it's not unusual for people to jump employers every 6 months. That's maybe not the rule. But even Google and Microsoft have turnover rates that imply a half-life of no more than a few years for the average employee. The economics of long-term re-training just don't make sense.
Is this the worst thing in the world, though? It allows savvy workers to continuously jump between companies and re-negotiate higher compensation packages, which helps make sure workers are paid at or near their market value. That wouldn't work in the accounting industry, because future employers would look down on that kind of resume history.
I managed a software development team, staffed with a number of folks that each had about 30 years experience. We were a fairly advanced team that wrote image processing pipelines in C++.
When they finally rolled up our team, after 27 years, the person with the least seniority had ten years.
It's entirely possible to keep folks for long periods of time, but it requires some pretty serious management chops, and those skills are not exactly encouraged in today's world, in any field, not just software development.
I worked for a Japanese company. The Japanese wouldn't even acknowledge my employees until they'd been around for a year. Having the same people on hand, year after year, was pretty much required, as our projects always took the long view.
I can't even imagine having an environment where engineers are expected to quit every year and a half.
No matter how good the manager is, if he is beholden to an HR department that only gives slightly above COL raises while the market explodes, people will jump ship.
Our HR was run by the Corporate General Counsel. It was pretty much as bad as you can get.
Also, we were paid "competitive" wages (read: below-market).
I was the manager for 25 years (I should mention that the engineer with the most seniority had 27 years; we started together), and I feel I did a pretty good job. Considering they did not "jump ship," I guess something went right, eh?
Nowadays, it seems that people "manage by fad," as opposed to managing humans. It's kind of heartbreaking, really.
Until they find out that while they were getting 3% raises, their salaries fell well behind those of the developers who came in after them and were paid at the market rate.
Did it ever cross your mind that there are more things than money which make a person stay at a company for a longer period?
For me personally, if the money is enough for me to make a comfortable life with my family, other things at work become important. I could never spend most of my days with assholes or do idiotic work even if the pay was way above market value, for instance. I'd also trade money for more free-time, if possible.
And yes, the company I work at does feel like a family. But nobody had to tell me this. It just does so, naturally.
That logic doesn't hold up. The overhead in learning your way around a Google or Microsoft-sized codebase is much larger than learning a new language or framework.
That's no surprise. Both of those companies have many individual codebases that are larger than any web framework. For instance, last I checked, Django clocked in at around 60-70k Python LOC [0]. The Windows kernel source, IIRC, is over 1M LOC, and, obviously, much lower level than Django.
---
[0]: Also, IIRC, 20-30k of that is taken up by the ORM.
This is a good comment with a lot to unpack, but I want to raise a couple points here.
First, I wonder if firms investing in training could possibly improve turnover, thereby creating a bit of a positive feedback cycle. It doesn't even have to be formal training, either. It could be something as simple as having a weekly journal club, or the equivalent, and encouraging engineers to read at least one research paper a month. [0]
The second aspect, engineers moving jobs just to get raises, seems weird to me from a market efficiency point of view. Interviewing costs companies money -- so much so that it's something they should want to do as little as possible.
Many companies don't keep pace with the open market in terms of raises, which is a primary force driving people to job hop. Are there any studies comparing companies that do at least attempt to keep comp for current employees in line with the open market against those who don't?
---
[0]: In my experience, reading research papers thoroughly can be a pretty thought intensive process. In grad school, where I studied math, what I would do is read the abstract, decide if it was interesting, then skim the section headings and statements of theorems to see if I wanted to go further. If I did, and I was searching for a particular widget I needed for a proof, then I would read as much as I needed to read to digest the proofs of the useful theorems. If it was for general interest, then I would read the whole thing. I found that once I got an interesting paper in my hands, fully comprehending what it said could take up to 1 day per page for particularly dense papers.
> Their employers pay them to learn and gather the expertise needed for their future in the firm.
I think it very much depends on the company and culture. I've always had jobs where learning and self-improvement were encouraged and expected (also in Germany), with a budget for conferences and books and a fixed time frame (~20%) for it. These were all companies that primarily did software development -- either direct product development or project work for customers. On the other hand, you have firms where software is treated as an appendage. They might have other great products but an entirely different managerial background. A mindset like "We need a software department, everyone else has one too" can easily lead to mismanagement -- and I think a lack of time and budget for self-improvement is an aspect of mismanagement in the business of creating software.
Ah, training - I remember that. Sadly I was only ever "treated" to a couple of "real" (paid-for) courses at the start of my career and the rest was on-the-job.
I've also heard of a lot of employers that are pushing the 20% policy for their software engineers or providing some other time allocated towards upskilling.
My employer requires us to do one hour of training per day. We are allowed to study whatever we want or work on personal projects. I personally don't think it's weird for an engineer to keep up with emerging tech and trends. I'm sure a lot of engineers outside of software do this.
I have a slightly different problem. My employer will pay to help us keep our skills up. We can take training classes and go to 1 related conference per year. We have some time during regular development to work on innovative projects that we'd like to implement or experiment with.
That's all great, but much of it ends up not being applicable. For example, we were trained in a new language about 3 years ago, but we haven't been allowed to use it in our product yet! I've been doing my home projects in it, so I'm ready to go when we do start using it. But most of my coworkers took the class and haven't touched it since. They've likely forgotten everything about it. Likewise, the conferences are nice, but I've never implemented anything useful after having read a paper about it, or seen a presentation on it. (The few times I've tried, it turned out the paper didn't give enough information to do your own implementation!) It does keep me aware of what's going on in the field, but I'm not sure how useful it actually is to my job.
You definitely have to push for it; only in the best companies will everyone from top to bottom of management take care of this proactively. Speak to your manager regularly about needing to expand your skills, tell him why, and show him what you want to do and how he can help. Some managers are not very good at this, and you'll need to do the majority of the work. Accept it and do your part, but don't think training isn't needed just because your manager doesn't bring it up.
> Whereas, as a software engineer, very few companies (at least in Germany, from my experience) will invest in their existing workforce to train them on the job
Maybe I've been fortunate, but in the UK and for a couple of years in Australia, I have had employers (and later clients for my contracting business) who have been happy to throw me at projects far enough outside my comfort zone that I keep learning and stretching my muscles. I feel at the top of my game (much of the time).
> ... (in Germany) ... instead seek to let them go once their expertise is no longer valuable and hire someone already experienced in the needed stack then repeat
Interesting. I thought Germany's labor law highly discourages this. Isn't "It is easier to divorce than to fire someone." a German saying for your tough labor law?
Note that these other professions have tests and certifications you need to pass. I can see developers howling in anger if they were required to do this.
I don't think it makes sense to have this in the form of required licensure to practice, but I certainly wouldn't mind if there were tests and certifications I could take that would allow me to show prospective employers what I could do. Extra bonus points if having those things on my resume allowed me to avoid the types of interviews where the candidate is essentially put on the spot for an entire day in front of a whiteboard or laptop. To their credit, I believe that's a direction TripleByte might be attempting to go in, but I don't pretend to be able to speak for them.
> They're working on some legacy BASIC or COBAL system from the seventies, and surprised they can't find a new job when that's upgraded and they're downsized.
Wow, exaggerate much?
This doesn't describe the vast majority of "older" workers, it's just another disappointing stereotype and expression of ageism.
The problem is that there is an undue burden put on those in group 1 to prove they are not group 2.
Imagine an engineer walks into an interview. It's a young person, you think nothing of it and you go on with your normal interview. Now a different engineer walks in, and he has grey hair and some wrinkles. You feel the need to dig into whether they're in group 1 or group 2, in addition to your normal interview.
I'm not saying you're wrong, but if we were talking about how there are two groups of women and one of them is a liability a lot more people would be setting off alarms.
Given how interviewing works, I think that burden falls to some degree on anybody who goes through the process, young or old; male, female, or non-binary; no matter their color or creed. A normal interview should dig into the qualities of 1 and 2, although in practice many interviews apply downright insane, power-tripping prejudices that a stranger couldn't possibly know about, like filtering on whether the candidate follows arcane, arbitrary and archaic fashion "rules", or whether they write a thank-you letter afterwards.
People do think that about women, but it usually goes “Is this the kind of family-centered woman that will be on perpetual maternity leave and not pull her weight?” It’s very hard to do anything about it even if we acknowledge that it happens.
To get a rough idea of the value of up-to-date skills vs apparent age, let's consider a hypothetical:
If both a 50 and 24 year-old graduate from the same coding boot-camp, do they have equal odds of being seen after the first interview? (let alone actual employment)
And how big is the difference in probabilities?
I also note that desktop games are primarily coded in C++/C. If hiring is skill based then we would expect that industry to be zealously recruiting older engineers.
Sadly, from experience, HR will blindly screen them out and give the usual "overqualified" spiel as an excuse not to progress them. They regularly do this and get away with it.
I was in exactly that situation once. I bypassed HR and went straight to the manager, who lambasted HR; I got the interview and the job, but HR still shafted me. They messed up my salary, lied about a raise, and generally made my life hell with pettiness and bullying, for want of another way of putting it. One person in HR took my side and told me what her manager was up to, and the next thing I knew, that person was gone. So yeah, HR causes many of these ageist issues in the situation you outline.
What I've heard about game development (correct me if you disagree) is that it's typically more demanding or lower-paying than gigs in other fields, and that it's only worth being in that industry if you're passionate about it. I think older engineers trend towards less demanding or higher-paying industries, and are less likely to be as seriously passionate about gaming.
> They're very different crowds, and in very different types of companies.
This is not a fact. I've seen both of your types in the same company. Let's not reduce people down to "you're either this, or this". It's not a good way of thinking. People are more complex than that and have different things to offer.
I am fascinated when the #2 group pops up to say there is age discrimination in coding - when you dig into it, the problem is about skills and not age (although I cannot really speak to ageism in the tech sector).
Every other professional career requires you to stay up-to-date in your professional skills and knowledge to stay relevant in your field. Why would coding/tech in general be any different?
Just because there are older folks struggling to find jobs after not keeping up skills doesn't mean there isn't also ageism, and it can become problematic when everyone disregards the ageism by saying it's all just meritocratic and skills based.
I think there are lots of younger developers in the hiring process who start out with a bias of assuming older developers have atrophied skills. Then when that bias makes it harder for older developers to find jobs, the younger developers say "ah, well that's meritocracy for you".
This isn't true at all. Think of other professionals - lawyers, doctors, professors, etc.
A doctor specializes in one aspect - surgeon, anesthesiologist, ER, GP, psychiatry, etc. They might pivot once in their career, but most of them don't, and they're able to find employment as long as they're able and willing.
I know several lawyers, and most of them had to specialize by their early 30's if they ever want to make decent money - family law, real estate, employment, personal injury, whatever. Again, the older the lawyer, the more seniority they have and higher they can bill at most firms.
How many professors in academia do you know that have experience teaching in multiple schools, e.g. business, engineering, social sciences, etc? Not many, usually they have a very narrow niche.
The problem with the tech industry is mostly due to offshoring, the rapid (and pointless) pace of new frameworks and tech that's mostly due to shifting dominant players, and the naivety of most software engineers who've been unwilling or unable to organize and create some sort of protective barrier similar to every other industry (teachers or cop unions, AMA, legal bar, UAW, etc.).
And again, because this is HN, the majority of developers are not worried that they won't be making FAANG salaries with sweet equity and stock options into their 50's. They're worried that they'll be training their 25 year old replacements from Bangalore at the typical mega bank or insurance company, left with only sporadic temp gigs and 6 month contracts at half their salary and with 15 years left before Medicare kicks in.
Python seems to be headed this way, as well. People tend to forget that Python is almost 30 years old already. That it's held up this long, and that it's still being developed and maintained strongly suggests it will continue to be a viable language in the industry for many more years.
There is still plenty of new development going on in Java, and I hope that continues. But I'd be afraid that if Java is all you know, you're going to increasingly be stuck on critical legacy JEE / Spring apps at banks, insurance companies, etc. Right now that's okay - there's still a lot of innovation in these frameworks. But in 10-15 years, it might be the worst kind of gig left, stuck with offshored and contracting teams of the lowest bidder.
>Every other professional career requires you to stay up-to-date in your professional skills and knowledge to stay relevant in your field.
Is this really the case? My limited experience is that the amount of constant learning expected from a coder is an order of magnitude more than in most other fields.
I work in education - if I were to sit on my hands and use nothing more than the most basic foundational research and 'best practice' from when I left my post-doc program 10 years ago, I would be unemployed. I work on professional development and skill building constantly.
That's not the exception to the rule, either. You're expected to stay current on trends in your field, and stay ahead of best practices.
#2 describes a trend in every sector and business: a lot of people find it harder to get work as they get older because the value of a skill set/knowledge base evaporates. From coal mining to payroll.
Definitely agree - at my last employer, there were three of us in our 40s. Our skill sets could be best summarised by "the year 2005": Subversion, MFC, and C++03.
Is "subversion" a skill (unless we're talking about spies ;)? It's just a tool. I've used it. I've used cvs, git and other stuff too. If I had for some reason to work in a company that uses some other tool to handle their code repos, I'd learn it in a month or so (less if it's organized logically). It's like saying "driving a Honda Civic" is a skill - driving is a skill, Honda Civic may be what you're driving right now, and next week you may need to drive a BMW, and being able to do both is what I'd call a skill.
So? Subversion works fine; nothing wrong with it. Are you now a bad developer if you focus on actually getting stuff done instead of spending time migrating to a new tool?
Considering the company is down to its last couple of developers and can't attract new ones because of the old technology stack, it's definitely a problem.
I guess, but those tools alone build powerful fast software, and those programmers are solving problems which translate well to any field requiring problem solving. The software world may change rapidly, but a React / Git stack is no different than a C++ / Subversion stack when both jobs require programmers to solve hard, complex problems.
I see a lot of 40/50 year olds doing Java/Python/C++ development with CI/CD and unit testing skills who are capable programmers without knowing much about JS/ML/k8s. To me that puts them right in between your #1 and #2.
As a programmer in that age range I know I don't need to spend time learning the nuances of JS if I'm not using JS. If I start on a JS project (god forbid) I can learn that then. Just like I learned C/C++/Java/Python/PHP/Ruby/Clojure etc when I started on one of those projects.
Machine learning - we know that just means linear regression or Bayesian filters plus marketing, and we prefer programming. We've also seen 20 years of "magic bullet" solutions like ML fizzle and die in the real world and know most ML projects never see a day in production.
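To make that jab concrete: ordinary least squares - the core of a lot of what ships as "ML" - fits in a few lines of plain Python. (A toy sketch with made-up data, not anything from the thread.)

```python
# Toy ordinary-least-squares fit of y = a*x + b, in closed form.
# All data is made up; this is the "linear regression" behind much marketed "ML".
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.0 * x + 1.0 for x in xs]  # noiseless line, so the fit is exact

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
    (x - mean_x) ** 2 for x in xs
)
b = mean_y - a * mean_x  # recovers a == 2.0, b == 1.0
```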
K8s is great if running k8s is your job. But it is a specialized skill that is only needed to run very large infrastructures, unless your project architecture (read microservices) has gone very wrong.
20/30 year olds think "keeping your skills" up means learning every new programming fad that blows through because they don't have the experience to differentiate the fads from the advancements. It is like telling master carpenters they need to keep switching brands of tools to "keep their skills up". But all these tools do the same things and are 99% identical. They are busy building stuff with the tools they have.
I love the carpentry example, it's perfect! Sometimes I think tech hiring is kind of like hiring a carpenter, and asking him, "Do you use a circular saw? Because we're a circular saw shop. And I don't mean your 1952 Black and Decker saw, I'm talking a modern day Makita. If you don't have modern Makita circular saw experience, you need to work on updating your skills!"
The older I get, the less interested I am in learning new things for the sake of it. I've already programmed with a whole bunch of different environments in the last 20 years; what's the value of learning something new if it doesn't give me any benefits?
It's not being lazy or a "dinosaur", it's just better time management.
The absolute worst devs I've worked with were very "up to date" people who wanted to re-do all the already-working "outdated" stuff to new "modern" standards. Often, it's just a waste of time.
Very much this. Knowing the particular set of keywords that constitutes the latest fashionable language is a short-term skill. Knowing the paradigms that guide all of them, and being able to learn the keywords in a short time if necessary, is a long-term skill. Some companies prefer to "hire to the spec" to save short-term learning costs. Smarter companies look for long-term skills, which will be useful whatever keywords are in fashion this season.
Which is typically a symptom of having worked at a start-up vs. an established company.
During the dot-com crash, I went from working at a 40 person start-up to a 25,000 employee utility company, and it was a real eye opener. A lot of my "cutting edge" (for the time) skills were dismissed as being flash in the pan, and all the "real work" was done with tried and true technologies. I ended up finding my way back to a start-up a few years later, and everything was reversed again.
That says "startup". ML is the buzzword that investors love, and k8s/devops allows avoiding big investment into infrastructure which may need to be dropped anyway when it turns out the market doesn't actually want yet another "apply ML to click stream to save on ads costs" startup (I'm stereotyping of course but you get the idea).
I'm 42, and until recently worked for a well known tech company in San Francisco. Most of the time I didn't feel like I was the only person over 30 in the room, but only after moving to another city and starting to work on a truly age-diverse team did I realize how unbalanced my previous team was.
My current team has a good mix of industry experience and excitement for new technologies, which makes planning both effective and exciting.
Part of the problem is that SF is just so damn expensive... it's going to self-select for people that can afford to live without the additional burden of a family and that tends to be people <30.
Some parts of SV push for diversity and inclusion, but when push comes to shove, firms are quite happy to protect established power structures - shitty managers, retaliatory practices, toxic culture, etc.
No, I am one of the people I just described. I don't know where I fit in with the OP. I'm struggling to keep up with the avalanche of new stuff coming along.
As a veteran dev in my 50's, "k8s" gives me the screaming heebie-jeebies. I've just about got my head around Docker. But it's painful seeing a system that should be a nice little monolith serving a few thousand requests an hour split up into microservices and "managed" using k8s for no good reason.
I realise this might make me unemployable in a modern web dev environment. Maybe I can just ride it out until the industry goes through the rest of the cycle and rediscovers simplicity.
The k8s basics are pretty simple, actually. If you know Docker, k8s is basically a way to keep a bunch of Docker containers running according to a bunch of YAML configs. There are all kinds of fine details, but the gist of the thing is just that. Of course, like every tool, it's not always used properly.
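For a sense of scale, "a bunch of Docker containers running according to a bunch of YAML configs" looks roughly like this minimal Deployment (all names and the image are hypothetical):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-api          # hypothetical name
spec:
  replicas: 3                # keep three copies of the container running
  selector:
    matchLabels:
      app: example-api
  template:
    metadata:
      labels:
        app: example-api
    spec:
      containers:
      - name: example-api
        image: example/api:1.0   # any Docker image
        ports:
        - containerPort: 8080
```

k8s's job is then to keep those three containers alive, restarting them if they die - which is most of the value for small deployments.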
It's not the complexity that concerns me. I have dealt with more complex things ;)
It's why people use it in the first place. I get the need for it when you're dealing with huge scale. But it seems to be the new default deployment model, for services that really, and I mean really, don't need to scale that much.
And I've seen people justify breaking a nice monolith into microservices (usually badly) so they can deploy it easier using k8s. Which is totally putting the cart before the horse.
It's easier to run a small service in a predictable environment where nobody can step on your toes. It's also pretty easy to adjust resource allocations, update pieces independently, and isolate screwups (one part going down is sometimes better than the whole thing going down), etc.
I mean, of course you can't approach the task as "we want to deploy on k8s first, no matter what" - you have to consider the task at hand, and if it works better as a monolith, keep the monolith (you can still use k8s - it's just a way to run code, and any code can run within it). But if the task suits the model - e.g. many data processing/transformation/ML workflows do - having a tool like k8s can make deployment easier. One doesn't have to make a religion out of it, but if your problem looks like a bunch of processes working on separate tasks, it may be a useful tool to manage them.
Whether it'd scale better, performance-wise, is a very tricky question, which depends a lot on the task at hand. I think flexibility is a more important aspect. If your task is monolithic and all you need is raw power then maybe k8s isn't the answer.
> Whether it'd scale better, performance-wise, is a very tricky question, which depends a lot on the task at hand. I think flexibility is a more important aspect. If your task is monolithic and all you need is raw power then maybe k8s isn't the answer.
For most (80%+) of the applications I've seen k8s used on, the performance question is not tricky at all. Monolithic performance would definitely be orders of magnitude greater.
I can't help but draw the conclusion that people are using k8s because it looks good on their CV. Whether I'm wise in being skeptical about k8s at my age is a good question.
I can't say much about deployments I haven't seen, but I am using k8s at my current job and where we use it it works quite well and makes deployment easier. I can't tell much details but it's basically processing a bunch of data in real-time in a bunch of ways, organizing them in certain manner and serving certain set of queries from the result. Before anybody asks, no, it's not ads and not clickstreams or anything like that :) And deploying with k8s seems to work decently with that.
Moreover, I can see quite a few places where on my last job (which didn't use k8s) introducing k8s deployment could help in some places. That said, "one size fits all" is never a good strategy. But I think saying "people are using k8s because it looks good on their CV" is wrong. It has its uses, just don't put it where it doesn't belong.
There's a whole lot of current, modern programming that doesn't involve JS, ML, or k8s. Heck, I'm a young programmer and I've done marginal amounts of ML work and avoid k8s beyond a high-level of understanding of what it is.
As an older programmer I have been actively avoiding JS/ML/k8s.
JS is simply garbage we are stuck with, where you have to learn all of this year's footguns to avoid writing bad code. ML is a buzzword of limited scope. k8s is system administration by another name. Web and mobile technologies are useless to learn unless you need them RIGHT NOW, as they have a half-life of 18 months.
I want to learn the "force multiplier" sitting beyond what we are using today. GC languages were the last round of force multiplier, and we haven't had much since.
Right now, the only candidate that looks to be a force multiplier is Rust, but I would jump to something else that looked like a force multiplier.
ML is a "force multiplier", but it has limited scope. It might be worth learning depending upon what field I'm sitting in.
Go is just another "managed language" with some oddities (somewhat better support for concurrency useful to servers and some programming in the large improvements).
Dart goes in a similar bucket even with the native compilation.
These languages are all effectively Java with some makeup.
I'm not seeing much force multiplication. I see no new language allowing me to write something more than what I can write now.
It remains to be seen whether Rust will wind up as a force multiplier or not. But it's really the only current candidate even if it's not a great one.
Haha! Given they listed k8s and JS together with ML, clearly that must be something "trendy", i.e. Machine Learning. I love how acronyms can be translated completely differently depending on where you're coming from.
That's just your wishful thinking. Here on HN I've seen both people who stayed up to date complain about discrimination and interviewers say that they expect older developers to bring more to the table - i.e. a decent older developer will be at a disadvantage when competing with a decent younger developer who has learning potential.
Yeah, the issue as an older developer with age discrimination is not that you are entitled to some credit for years of experience when say, you haven't been using Java since college, but that you can't be hired on the same basis and salary as a new grad who has no experience either. Despite having demonstrated many times your ability to learn in the past.
It's easy to stereotype people who have unrealistic expectations based on entitlement because of their age, but past a certain point, a lot of employers will reject an older candidate even at the same price as a new one, on the assumption that they can't learn any more.
Of course, if you are employable, you don't bang your head against the wall, you go and do something else. Like any kind of discrimination, if they were forced to accept you, it wouldn't make the culture palatable. It reduces your opportunities though.
I agree in general, but "skills" are not the only valuable commodity. Real experience matters, and I personally feel a lot of so-called experienced veterans have more like "1 year of experience 10 times" instead of "10 years' worth of experience". So that is another challenge with people who have been in the industry for a while.
Can you provide metrics for the number of people you've seen in each of these two categories? Also I'm curious about what age demographic you're in and how long you've been in the industry. Hoping that I don't sound combative asking this, I'm just really interested in your perspective.
I won't go into personal specifics, but I'm mid-career. I've been through a few companies, and made it as high up as C-suite in a smaller company, and director-level in a bigger company, before finding I prefer senior IC, tech lead, advanced development, or research roles.
Of the companies I've been to, most were highly-tech focused (and pretty elite teams), but one was a large, distinctly non-tech company (with mainframes, even, and not the modern types, running legacy algorithms). I ran a small skunkworks team there.
I can't give great statistics, aside from saying that at the tech companies, people tended to fall into the category of older is better. Senior engineers were senior.
At the non-tech company, people tended to fall into the category of older is obsolete. The tech team was there for the paycheck. They worked 9-to-5, maintained a healthy work-life balance, and kept fairly boring systems running. The work they did was really outside of the company's core competency. It just needed to be done, and someone needed to do it. If layoffs ever came, I'm not quite sure who would hire them, though. It was culture shock for me. It wasn't specific to this one company either, but to the industry (we had pretty close contacts with both collaborators and competitors). This company was industry-leading, actually.
> I've rarely seen #1 and #2 mix. They're very different crowds, and in very different types of companies.
This has been my experience, too. Generally, the closer to doing technology as the main offering -- e.g. network engineers at an ISP, or developers at a code shop -- the better and more technical they are.
Once you get into the Enterprise, where they're handling specific apps and frameworks, they tend to get stuck in patterns and wither.
That's not to say you can't be an old grognard doing COBOL at a niche code shop, but when the tech itself is the offering you find much more technically competent seniors.
All too often "keep up their skills" is code for "learn what I think is important" and says more about the speaker's (in)ability to know what's important than about actual skill. That's how ageism creeps in - not intentionally, but through lack of awareness of one's own bias.
I agree with this, the nuance is though that many interviewers are convinced they filter out the #2 with their pet CS quiz questions with no regard for experience or what the company actually needs. Not talking FAANGS, this is smaller companies reaching out to recruit senior devs they supposedly need.
I can agree with the methods described in the latter part of this post. There is no right or wrong, but as I began trusting the process of simply deep-diving for deep-diving's sake, I found my core muscles stayed in shape. Being able to grok and push in a short time is a valuable asset.
Any thoughts on how to identify companies where people from group #1 work? I'm thinking of this from both a mentorship and a future employability aspect.
- I love coding, and I still code every day. I have no desire to move into other roles (such as management).
- As for job security, I think it is harder to find a job as a manager than as a developer, simply because there are many more positions for developers than for managers
- Family-wise, it is much easier at my age, because my children are getting into their twenties now. Much more free time for me (including for work if I want), compared to when they were younger and need lots more attention (and being picked up from daycare)
- As for learning - that's one of the attractions of being a developer. There are always new things to take in, it never gets boring
- On the other hand, even if some things in programming are fleeting (the currently popular frameworks etc), there is a core of fundamental knowledge that you gain over the course of your career that will always apply.
50+ here. Never had an issue. Many media articles in this context also "consult" or "quote" or "interview" someone from a company that "consults older software developers on how to stay in the market". One note: you have to respect it from both sides, and factor out age when a young person is your manager or colleague. Your age is not an argument in a technical discussion.
Not everywhere is a high demand market. Brazil here. The job market really wants low paid workers that will work Saturday nights for peanuts. A lot of my older colleagues emigrate to have good technical jobs. Others become sellers or open a restaurant.
Very much agree with this. As someone who’s managed people 20+ years my senior, it always annoyed me when I overheard them complain about reporting to someone younger. I don’t hold their age against them, it should be reciprocal.
But the age gap alone is not a valid reason to complain. The young manager could be in their 30's with over a decade of experience themselves, and they may be highly competent at the work they do. Assuming the young person is not qualified to manage veteran programmers is also a form of age discrimination.
If they're managers they should be in management jobs. If they're not, why should they complain about not being management? Management is its own job, not a "reward" granted for time in service. Not even the military works like that.
The point is that it’s still age discrimination. If someone wants their age to not work against them, they have to be willing to look past the age of others.
Besides age is not an indicator of quality of work, nor are college credentials. I’ve seen much older devs run circles around younger ones, and vice versa.
I am sometimes tempted to tell colleagues "well I've done this professionally since before you were born, so...", but so far I've resisted the temptation. Hope I can keep the streak going :)
Even if you're right, it's not a great argument, because there isn't really a rational counter argument.
If the argument is only based on having more experience - the "I've done this professionally since before you were born" argument - it can be correctly rejected as an argument from authority fallacy.
If the argument is based on a pattern you've seen before that's germane to the situation at hand - the "I've seen this movie before and it ends with regret and a data breach" argument - it can be much more convincing.
You don't influence by pulling the seniority card, you do it with data. If there isn't data to back up your suggested approach being better, it might not be, and you get to learn something new!
Sourcing data is expensive. I think it's best not to underestimate just how much development has to be done based upon gut feel, trust or experience simply because proving it would take too long.
If it's two people arguing their own opinions at each other and neither one has relevant data where exactly do you go from there? The one with 2 years experience or the one with 15?
Sometimes the data is the experience, and it's hard to dump your personal experience on someone and make them ingest it. Sometimes you've tried approach X a couple of times and it always failed, but you don't have a scientific proof it will fail again. You just know from experience if you do X it usually ends up in tears and all-nighters and missed deadlines. Not because you have an Excel table proving it, but because you've lived it.
If we are going with anecdotal evidence: I'm 55, in the top 1% of my field, with comprehensive knowledge of modern and cutting-edge technologies. After age 45 I was denied promotion and had to look for other employment; I was fired (from another employer) at age 52, was out of work for 7 months, and had to take a position way below my level. Still there, but I could be fired at any moment (cost cuts, esp. under current circumstances). I receive a lot of calls from headhunters based on my excellent linkedin profile, but never got past a first interview in 3 years.
I'm only half kidding here, but have you tried dyeing your hair, wearing slightly "younger" clothes, not including everything on your CV (and definitely no graduation year, etc.)?
It seems ridiculous but I wonder if it would work.
It's not ridiculous: I'm partly bald, so I always shave my head. I tried removing graduation years from my resume and did not find it useful; it would only work if ageism were the sin of a few. Now I keep all my cards open: if my age is going to be the reason, I prefer to skip the doomed interviews that hiding it would have brought (wasted 2 months and 4 interviews, Amazon, yeah?).
> - On the other hand, even if some things in programming are fleeting (the currently popular frameworks etc), there is a core of fundamental knowledge that you gain over the course of your career that will always apply.
Spot on. Ignore the popular framework of the month.
A lot of developers focus only on the surface (popular tools) and neglect fundamentals, algorithms, OS internals, how CPUs work, networks...
You can't ignore it. If you ignore the framework that is a requirement for 20-30% of advertised positions, then you are limiting your ability to get hired. That may be good advice if you are the one architecting the new project, but not if you are looking for a coding job. If you are looking for a new coding job, look at what people are hiring for and at least get familiar with a good number of those things.
Of course if you aren't having trouble finding a job then you don't need to follow this advice.
I interviewed a whole lot of people when I was at Amazon, and no candidate was rejected, in my memory, for not knowing a tool, a framework, one language or one category of technologies.
The interview topics of on-sites are general enough and focused on principles, and, by far, the large majority of candidates are rejected for giving vague and superficial answers.
My experience convinced me that, as a candidate, I don't want to divide my time preparing and interviewing for many "typical" companies. I instead focus on very few interesting ones and maximize my chances there.
"shotgun versus sniper" if you will: you need one good offer, not 10 so-so ones.
Among other reasons, the typical buzzword-driven company is less likely to be an interesting environment to work in.
You won't apply to 100% of jobs. So you already make a selection.
Some jobs require highly specialized skills and knowledge. Keeping up with the framework-of-the-month costs time as well. Time you could have spent getting better at Linux kernel programming or COBOL.
"If you ignore the framework that is a requirement for 20-30% of advertised positions,"
The "framework of the month" is not "a requirement for 20-30% of advertised positions". The "framework of the month" requires a minimum of a year to get to that point and 2-3 years is a lot more common, even in the JS frontend space. By the time something gets to 20-30% penetration it's past that phase.
I've tried to keep one main language and work with something different each time. It creates diversity in my resume but still lets me use my core anchor language to find a role.
Well, one option is just to ignore framework of the month entirely. It is perfectly feasible to just pay attention to things that prove themselves to the point that they're 10% of the relevant listings and have sustained that for a year or two.
A mature engineer doesn't need to be bleeding edge. It's perfectly fine to merely be tracking technologies that have gotten to their high-growth phase.
The other thing I try to do is have coverage, such that even if I don't know a particular tech I have a story that says I can learn it easily. I've used enough DBs that I don't need to chase the latest things; I'm confident that even if I've never used a columnar database that I can pick it up quickly if I need it. I've used enough programming languages that I don't need to go chasing the latest one, because it is still frankly mostly a different spelling of things I've already used. And so on. So I don't need to go chasing everything all the time. I generally don't pivot into anything brand new, because within the domains I work in, there isn't much fundamentally new stuff left for me to pivot into.
(A lot of my recent growth involves learning how to do engineering while being more directly tied into the business and interacting with business people, and learning how to be an interface between business and tech, rather than more types of tech. I'm doing this from an engineering perspective rather than a "management" perspective, because it turns out there is quite a difference between the two once a company scales up enough.)
Why ignore? Take a look, have an idea. Add another data point to the chart of where the industry is going.
Once you have seen, say, 15 frameworks, it becomes easier to map a 16th in the coordinate space, and quickly learn it when needed by reusing the knowledge you already have.
I agree that it brings clarity. You need good soft skills though, or else that clarity can become bitterness as your pragmatic advice is ignored and your teammates wonder why you always seem so oppositional.
Hasn't happened to me, but I've had a couple older coworkers who seemed to be in that place, which just seems like such squandered experience.
>> but I've had a couple older coworkers who seemed to be in that place, which just seems like such squandered experience.
It is squandered. It's one thing to know the right path based on experience. It another to be able to share that experience in a way that benefits others and helps them understand why it's the best choice. Nobody knows everything either, so having a real conversation can result in learning both ways.
Question regarding your last 2 points: how do you reconcile the fact that "there is always something to learn" with "I'm basically just learning a library/framework that does the exact same thing but with different names in the API"?
I'm being a little hyperbolic here, but after a few decades in the industry it's kind of baffling how few novel ideas are in these new frameworks. There is even a trend to go back to server side HTML and SOAP-like APIs. The tech industry sometimes just looks like the fashion industry. But maybe I'm just too cynical :)
> I'm basically just learning a library/framework that does the exact same thing but with different names in the API"?
I won’t say I reconcile anything, but I get a chuckle out of it. Makes it easier to learn the new flavor.
As The Who said, “Meet the new boss. Same as the old boss.”
To add another example to the two you provided: how many different designs and implementations have you seen for pub/sub and message-based systems (topics/queues)? I literally lost count. Even fucking Redis has an implementation!
They go back at least to the 1970s.
(Remember Enterprise Service Busses in the 90s?)
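The core topic/queue idea really is small, which is part of why it keeps getting reinvented. A toy in-process sketch in Python (nothing like a real broker: no persistence, ordering guarantees, or delivery semantics):

```python
from collections import defaultdict


class Bus:
    """Toy in-process pub/sub: a topic is just a list of callbacks."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver synchronously to every subscriber of this topic.
        for callback in self._subscribers[topic]:
            callback(message)


bus = Bus()
received = []
bus.subscribe("orders", received.append)
bus.publish("orders", {"id": 1})
bus.publish("payments", {"id": 2})  # no subscriber, silently dropped
```

Everything a real broker adds - persistence, acks, backpressure, fan-out across machines - is where the dozens of competing designs come from.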
Things are constantly improving, even if the road is winding. People just forget some of the horrors of the past. We used to have VB/VBA, now we have Javascript. We used to have Ada, now we have Rust. C++ has evolved.
Even C has evolved, in a kind of meta way: C itself hasn't evolved much, but the ecosystem around it has evolved a lot. How common was something like Valgrind back in 1990?
I think there is some cyclic property here, but I like to think of it as a sort of meta-refactoring. As we go back to SOAP-like APIs, as we return to mainframe computing, as we rediscover peer-to-peer networks, we take all the knowledge and improvements from exploring an alternate area with us. The advancement is in the nuance.
And, of course, there's also just the Eternal September effect with the vast majority of developers needing to learn things that the grey beards have already learned but have no succinct way to communicate (or just aren't heard over the roar of constantly greater numbers of developers with each passing year)
No question about the job security angle - a software engineer in tech has the best job security you can get. I also agree it has better work-life balance than managers have, as long as you can get yourself motivated to get things done.
> a software engineer in tech is the best job security you can get
Can you explain this? Anywhere I have ever worked (Fortune 500 orgs) I have always had an "at will" employment contract as a W-2 employee. They can fire me at any time for any reason. Furthermore, when budgets are cut typically the first group to go is the IT department, since they do not bring in any revenue, unless the company is selling IT products and services.
I think there is a difference between working at a company where software is the product, vs a company where software only supports the main product (i.e. an IT department). Companies where SW is the product value developers more I think.
From a management perspective, a company where software is not the product makes the software team what's called a cost center. And companies love cutting costs. At a company where software is the product, the software team is treated more as a profit center. In general, you want to be in a profit center, not a cost center. Hell, half our homework in accounting 102 was problem sets calculating whether it was economically beneficial to outsource a given cost center - even at the most basic level, cost centers are places people want to cut.
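That accounting-102 exercise is, at its simplest, just a cost comparison. A sketch with entirely made-up numbers:

```python
# Make-or-buy decision for a cost center; every figure here is hypothetical.
in_house_salaries = 900_000     # annual payroll for the in-house team
in_house_overhead = 300_000     # office space, tooling, management
outsourcing_bid = 1_000_000     # vendor's annual quote

in_house_total = in_house_salaries + in_house_overhead
savings = in_house_total - outsourcing_bid
# Positive savings makes outsourcing look cheaper on paper -
# which is exactly why cost centers attract this analysis.
```

In a profit center the same math rarely gets run, because the team's revenue contribution dominates the comparison.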
> Companies where SW is the product value developers more I think.
And the IT budgets are different. Very different. On a related note, remember when some organizations had IT (and specifically web dev) under their marketing departments?
In case the OP doesn’t reply my interpretation is there is way more demand than supply so sure you’re “at will” but you could find another job quickly. As for fortune-500 and IT layoffs I imagine we are talking here about technology companies for which “IT” isn’t a line item to be cut but _is_ the business. Rules may be different at BestBuy or Bank of America but I bet not by much.
Oh, that's nothing to worry about. Any halfway decent software engineer who gets fired on Thursday should be able to just waltz into a new job the following Tuesday.
I keep reading that on Hackernews, so it must be true!
High demand still provides pretty good job security. I'm fortunate to be in the US and get paid a good bit more than if I were in another country. Many, many companies want developers in the US for various security reasons. If you work on a generic project that could be done securely by people in other countries, of course that reduces your job security.
I think this is the key. Developers in our age range that keep their skills up and keep learning new things can be very productive and an asset on a team. You have to stay humble and admit you still don't know everything.
This is ignoring the stereotype of the recent college grad, or someone with just a couple of years of experience, who thinks they know everything. I see it everywhere at my work. When you take into account that 90% of software developers of all ages are just not that good at their jobs, things start to make a lot of sense.
I’ve had a hard time adjusting to the AWS devops phenomenon. There is so much damn uninteresting configuration and button-pushing, rather than engineering, that I get bored with it. Yet I feel social pressures to get various AWS devops certifications to remain on top of things. Previously used Jenkins successfully for many years so I have devops experience, but there is something about EC2 instances, elastic beanstalk, load balancing, etc at AWS that just makes me want to retire (yet I still get tremendous joy from programming and designing).
Get used to it. Engineers are expensive, so if they can be replaced with button pushers they will be. Learn to push the buttons if you want to stay in the industry. You'll protect your value as an engineer because an engineer who can push buttons is more valuable than an engineer and a button pusher (especially if the company can get away with not paying an increased salary for the increased responsibility, which they usually can).
And besides, AWS has a stable API and a set of CLI tools to actually manage everything in the cloud without button pushing.
I am 29 now. Used Jenkins about 5 years ago because I couldn't find a much better alternative for a single-developer workflow, but it's a glorious pile of buttons with zero consistency and no API to speak of.
We manage, install and configure Jenkins as code, without clicking buttons. You can fully automate Jenkins activity and setup, from install to job creation, execution, plugin updates, config change validation, etc. Of course, only power users will take full advantage of Jenkins' power.
Jenkins isn’t really worth the hassle today in my opinion - there are a bunch of new build tools that solve the problem better without all the garbage that comes with it.
And they also look better.
For god’s sake, you could even use GitLab today. What does Jenkins give you? What is there to defend?
For complex pipelines and code reuse (plugins, pipeline libraries, scripting) between projects, Jenkins offers more features for power users, and its maturity has been proven, while GitLab CI still suffers from its young age.
Other examples with the Cons at the end of this article : https://medium.com/sv-blog/migrating-from-jenkins-to-gitlab-...
and https://stackoverflow.com/a/37430097/2309958
(and I can probably find a lot more :-) )
First of all, Jenkins was created in 2005, when REST was barely past the stage of being a gleam in Roy Fielding's eye. Yet, almost from the start, almost every page had an API (you'd add /api after the URL for its corresponding HTML page). With XML, JSON and Python (!) formats, with built-in search and filtering.
Secondly, to your point, there you go: https://stackoverflow.com/questions/17716242/creating-user-i... I'm pretty sure that the reason there is no REST API for it is because you're supposed to be using your favorite back-end (LDAP, AD, etc.), with which Jenkins can integrate.
It's quite disingenuous to complain about the Jenkins APIs; there are a lot of them. They're not perfect, or necessarily designed like something you'd design in 2020, but they are there.
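For anyone who hasn't seen it, the `/api` convention mentioned above is just a URL suffix. A minimal sketch of building such URLs (the helper function and example host are mine, not part of any official Jenkins client; `?tree=` is Jenkins' field-filtering parameter):

```python
# Build a Jenkins API URL from a page URL, per the /api/<format> convention.
def jenkins_api_url(page_url: str, fmt: str = "json", tree: str = "") -> str:
    url = page_url.rstrip("/") + "/api/" + fmt
    if tree:  # Jenkins' ?tree= filter limits which fields come back
        url += "?tree=" + tree
    return url

print(jenkins_api_url("https://ci.example.org/job/build-all"))
# https://ci.example.org/job/build-all/api/json
```

Fetching that URL with any HTTP client returns the same data the HTML page shows, in JSON (or XML/Python with the other formats).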
You can try out Microtica (https://microtica.com), a tool that abstracts complex cloud setups and provides easy configuration from the UI, so you don't have to spend time on the boring stuff and can continue the engineering you love.
Try it out and let me know what you think.
College professors can get stuff done in their 70s, and so can software engineers, as long as they keep learning; both demand a working brain. Software is getting complex; experience matters.
I used to work for Motorola; engineers with gray hair are gems, full of knowledge, always learning new stuff, and eager to coach newcomers. That unfortunately stopped when outsourcing became the fashion.
I'm old too; I keep learning every day, delving into C++17 these days, and it makes my days fulfilling.
Coaching and mentoring turns out to be as valuable for the Teacher as it is for the Student. The good ones will take the time to do this - even if the original plan was to shape the thought process of their evil minions. :)
I have always worked for product companies, on products that I find interesting. Currently it is a product for margin calls in finance, before it was SMS routing and delivery, before that VoIP. Even if everything I do isn't new or interesting, there is enough variations and challenges, both in the code and also learning more about the domain.
The times I have started to feel that it was boring and repetitive, I have changed jobs (but I've stayed at least 5 years in each place).
I also like to try to get better at what I do as a developer. Mostly this is by reading books or taking MOOC courses. I think there is quite a lot to learn about developing SW well (because it is a really complex activity). So that also keeps me interested.
About eight years ago I also started blogging [1] about SW development. I've found that trying to formulate what I think about it has also kept it interesting.
Thanks for sharing. I'm currently doing fullstack web development and find it not as interesting as it used to be (close to 6 years now). I'm trying out various projects, but they either have a steep learning curve involving deep math (i.e., machine learning, graphics) or require domain-specific knowledge I usually don't have.
I'm a former tech recruiter for startups and now a resume writer and career consultant, and I've written for software engineers from 18 through their late 60's.
- Older software engineers (40+) won't be discriminated against if they are doing cool stuff and are doing different stuff than they were doing a few years back. I've written this a thousand times before, but we often confuse ageism for stagnation. If you've had the same job for 15 years working on the same system and using the same languages, you aren't a victim of ageism as much as you are being discriminated against because they don't think you'll be able to learn new tricks.
- On resumes, don't advertise your age if you're a bit older. We don't need to list that internship, first job out of school, or graduation dates from college. List 15-20 years of your career and leave the rest off. It isn't a biography.
> Older software engineers (40+) won't be discriminated against if they are doing cool stuff and are doing different stuff than they were doing a few years back.
But I don't do cool stuff. What I do is to keep a ~$25 million per year money printing system from falling over while the youngsters are in year 7 of the 18 month project to replace it with super cool tech. Making money is so overrated these days.
You'd be surprised how little corporate bureaucracy values the act of "keeping the lights on" for an existing and stable revenue stream. Obviously, keeping the money hose going is good for the business, but it only gets recognized in the case of failure. To put it simply, it doesn't create new value, which is what corporate culture in the US is all about.
Coming from “sysadmin”/tech/keeper of the lights on: there is no win at the bottom line. Keeping the lights on always costs money, they always wish it could cost less (except when inter-company bragging about having the new hot tool), and when you’re doing your best possible job they will forget you exist. If something happens on your watch they can just as easily assume you have not been working at all.
One thing I’ve distilled from all this: Always keep at least some focus on how much your work costs vs how much value it’s providing.
Maybe what you do IS cool, but you expect others don't think it is, or you don't make it sound cool when you talk about it. I've made some pretty mundane work sound cool, but it's not always easy.
You may be right about that. I should note, however, that I am not out there looking for a new gig. I have kids, am getting a master's degree part time, and have a small side business. Yet I still have time to read books at night. Find me another job where I can do all that :)
having been part of many candidate interviews you would have massively scored in my book with this comment. you have values other than work. that's something I value very highly.
wish there were more like you these days to interview (I am just 42, so take it with a grain of salt) but the youngsters, straight from college with a streamlined CV are sometimes difficult to deal with imho.
I have a disdain for people that don't finish what they've started and don't satisfy requirements. I don't feel as though that has much of anything to do with age. Anyway, if that makes me hard to work with, so be it.
Not necessarily. If you list, say, 15 years of experience and a degree with no date, the reader will assume you are "at least 37". We're just trying to get an interview here - when you show up they'll have a better understanding of your actual age (in most cases), but at that point they're committed to the interview, and hopefully the candidate performs well enough that age won't be a factor in any decisions.
At this point in my career, if I was prepping a resume, my education section would have one line:
Wossamotta U, Computer Engineering, BS
Doesn't need a date or any details, my job experience is more relevant, and if they've got questions, they can ask. If I had another good stint of work, I could just put Education: yes. Also going to leave off my college jobs and write Recent Experience. Although I got my last job through networking without writing a resume, and I'd expect the same for future work.
I list education without graduation date because I attended for 2 semesters. I've never considered it labeling me as old, but I've never had a hard time getting jobs, with the exception of a few jobs near me that only hire out of specific universities.
I do the same. I went to college for 2.5 years, never completed a degree. I'm not going to lie and claim a graduation. I used to list the date range I went to school, but now I leave that off entirely. No one has ever once asked about school. With 20+ years of experience on my resume, I doubt they even read the last section.
> On resumes, don't advertise your age if you're a bit older. We don't need to list that internship, first job out of school, or graduation dates from college. List 15-20 years of your career and leave the rest off. It isn't a biography.
I started doing this years ago. If I listed every job on my resume, it'd be 4-5 pages long. Most of them aren't relevant to what I'm doing now, and only serve to make me look old, or like a job-hopper (which I am--about every three years--but job-hopping in tech isn't a negative unless your list grows too long). At first, I compressed older jobs into one-liners at the end, but now I leave them off entirely. I list 4-5 of the six jobs I've held in the past 22 years.
I have no way of knowing what impact this has on my employability, or how people interpret my age based on it, but I'm 47 years old with 22 years of experience on my resume, so I assume I come off as both old and senior-skilled. I never have to try hard to find jobs.
I feel job hopping in tech has gone from a negative to a positive in the eyes of many who are hiring, as long as the hops are for good reasons (better opportunity, company shut down, finished what you were hired to do, etc.).
Jumping around because you can't keep a job or nobody likes working with you is another story.
In Germany you're expected to include your age and a headshot on your CV[0]. I always considered that an insane invitation to prejudice and have only ever done it once, for a job I was guaranteed to get.
I'm sure other countries have other expectations.
In the US it's easy to forget that not everybody is "against" discrimination in hiring, even in tech.
I've written for clients in probably 50+ countries, and many do commonly include photos and birthday. I would hope people are 'against' discrimination.
Your first statement contradicts the second, which only confirms the theory that recruiters aren't the smartest people. If you want to argue that ageism doesn't exist, please provide some stats from your former work. I believe the real situation is even worse than we can observe, precisely because people over 40 had to cling to old, uncool systems/languages just to keep their jobs.
"doing different stuff than they were doing a few years back" so engineers who fail and need constant redos? Because the project I'm working on has had the same stable and productive architecture and tools for years. The only reason to redo it would be if the opposite were true. Our users don't know what technology we use. They just know the product does what they want, and keeps getting cool new features.
Who said anything about failure? If you're happy doing the same stuff day in and day out, that's great and I hope it works out for you. In my experience, people doing the same thing for a long time tend to struggle to find jobs when it becomes necessary.
If you keep building 'cool new features', you don't appear to be doing the same thing over and over again.
The skills to build from scratch and the skills to maintain are different. If you've only been doing maintenance, why would you expect to be at the top of the list to build something new? Maybe you personally are perfectly skilled to do it. How would anyone know?
I'm 35, so kind of in the middle. Frankly speaking, I'd hate to work in a company where the average age is 20 as much as one where the average age is 60. Sometimes I see job ads with things like 'youthful colleagues', which is crazy. I want to work in an environment where I can talk to someone in their early 20s and gain from their perspective and understanding, and also to someone who's been round the block a few times and can tell off the bat that the shit I came up with minutes ago is neither smart nor useful because X, Y, Z.
Putting all this aside, one thing to consider is that there are tons of companies out there with a reasonably simple operational model which, once streamlined, doesn't require senior people, or only very few. And those companies are very happy to keep the status quo by only employing from certain demographics.
In my country there is an industry in setting up your software company to simulate university life to attract the newly educated developers.
You’ll get cheap, extremely motivated employees, who spend almost all their time at your company thanks to boardgame/role play/pizza nights and you get them to keep up-to-date on techs by having them do weekly tech-workshops and presentations.
They produce a lot of cheap code, sometimes the quality is decent, other times it isn't, and their most talented developers tend to leave for “adult” jobs after a few years, but overall it has been a very successful strategy.
But younger engineers can be had more cheaply than experienced engineers. There is also a culture of having to prove yourself which often leads to junior engineers taking positions with companies that are toxic and which dramatically underpay them in order to get some years of experience. I think in our pandemic world, there is also a problem where people who may want to change jobs, due to poor working conditions, will stay put in fear of losing their pay check. These things and more, I'm sure, all compound against all of us, but I think can be felt more strongly at the lower end of the experience range.
Oh yeah, the "pizza" party. 50+ developer here. As I age, pizza and games feel more like going to Chuck E. Cheese (a US pizza chain for children's birthday parties). I'd rather watch a soccer match at the nearest bar or play frisbee in the parking lot.
I would go a step further by suggesting that diverse workplaces are healthier since people are more inclined to consider the perspectives of others. The most toxic workplaces I have experienced have been monocultures. In those cases exclusionary behaviour ran rampant, even within a peer group.
Have you worked in a company where the average age was that low?
When I joined Snap about 5 years ago, I was middle-aged compared to the average, which was probably in the 20s - even my CEO is reasonably younger than me :). The company was also pretty small (100?).
From that (single/anecdotal) data point, I'd say it is not as bad as you portray. There is definitely an affinity for trying new things - part of your job is evaluating them for all they're worth rather than their glitter, and ensuring you communicate that well / provide value. I always try to focus the conversation on the problem we are trying to solve and how the new tech solves it better. The good alternatives usually seem to catch on very quickly (see Go vs Java, Kotlin vs Java) compared to the also-ran technologies. Your worth will kick in based on your skill level in judging these, and definitely in getting to a mentoring role rather than a gatekeeping role. But it has been a lot of fun.
The agility is also something to keep in mind: Snap allows for extreme career mobility. Recently I switched teams from doing data related work (for the past 15 years?) to a complete unknown of joining our spectacles team and Snap has been supportive. You just don't see the typecasting you tend to hear about in bigger companies.
So yeah - give it a shot with an open mind! You may be surprised!
And this is exactly the problem with tech. At 35, you're only about one decade into your career.
Lawyers, doctors, professors, etc. at 35 would still be considered relatively young, with plenty of time to become partner, senior surgeon, obtain tenure, whatever. But in tech you're already at peak salary by then, and can expect only inflation raises from here on.
And worst of all, you've got 30 years left until you can obtain Medicare, so if you aren't working full time at a job with decent benefits, you're absolutely screwed, especially if you 'dare' decide to have a family, 30 year mortgage, etc.
People often live 'till their mid 80s nowadays, and can't receive any sort of benefits (in the US) 'till their mid-to-late 60s. If your career has peaked and is trending downhill at 35, and you're not making millions of dollars in your prime (professional sports, fashion modeling), there is something wrong with your industry.
> Lawyers, doctors, professors, etc. at 35 would still be considered relatively young, with plenty of time to become partner, senior surgeon, obtain tenure, whatever.
After ten years in a mature field, haven't you learned, or even mastered, a large portion of your specialty? You're 95% there already?
I think OP was making the case that at age 35, non-tech workers don't have ten years experience; rather, they are just starting their professional careers due to school.
As I've gotten older my time has gone from '20% think about it - 80% code it', to '80% think about it - 20% code it'.
I currently work with a surprisingly well balanced spectrum of developers at my current employer. The median age is in the early 30s, with probably a third of us 40 and older out of 150 developers.
I'd say with some confidence, that the older developers in my company complete as many "tasks" but write less code when doing it, with a lower defect rate.
That 80/20 split (for younger me) is what taught me enough that I can now think it all out ahead of time. When I just started around 20 years ago, it was all new, it was all confusing, and all the answers were hidden behind obscurity, gatekeeping, and strange social norms (no StackOverflow back then). So I had to learn by brute force, smashing my face into every project for somewhere around 100 hours a week.
Now I have some knowledge and perspective, even the ability to pick out the fads and novelties from time to time. I can try new things on small projects and I can go with tried and true for the big things and I tend to understand which projects are better for which approach. I can visualize the data, the models, the inputs and outputs, and think through the logic from beginning to end, all within about 30-50 hours per week. That took lots of late nights and a ton of trial and error.
There's a good place for the less experienced and more experienced on any well balanced team, to be certain.
I’m in my late 30’s and almost all of my good friends with a CS/BIT degree have cycled into management. The interesting thing I’m noticing now is that they all have ideas for projects but zero capacity to build anything, because they’ve lost their building muscles. They also seem reluctant to really want to learn anything else, which is antithetical to software engineering, where a new framework is out every day.
It's not about losing the building muscle. It's about losing the motivation to do so in tech companies. After many years of being IC, things can get boring. You keep doing the same things, even if you move companies and technologies change. And you always do it in a sub-optimal environment - things like working with tooling you don't like, under managers or with co-workers you don't really like, on projects you're not super excited about, etc. So moving to management is almost inevitable. Not to mention there are other reasons such as better salary and more visibility to change things. Also, people need change in their life. Moving to management is one way to change things. Some people go back to being IC, but most don't.
You know what's funny: I would not call the environment you're describing a tech company. I would call it a tech-enabled company. If you are working at a tech-enabled company, then I 100% agree with you that moving to management becomes inevitable. Tech-enabled ICs are basically rendering JSON/SQL/NoSQL on Desktop/Web/Mobile. If your company is just spinning up .Net Core/Node/Django/fill-in-framework-here to build a website to show some data, that's not a tech company to me.
When I say tech company, I'm talking about companies where software engineering skills actually matter, where an O(n²) algorithm will also cost your business. In my definition of a tech company there are tons of new things happening. Look at spaces like AR, self-driving, rocketry, machine learning, and computational photography: no one in any of those fields is doing the same thing they were doing even 2 years ago.
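To make the O(n²) point concrete, here's the classic list-vs-set deduplication example: identical output, very different scaling as input grows. A minimal sketch, not from any particular codebase:

```python
def dedup_quadratic(items):
    """O(n^2): `x not in seen` scans a list on every iteration."""
    seen, out = [], []
    for x in items:
        if x not in seen:
            seen.append(x)
            out.append(x)
    return out

def dedup_linear(items):
    """O(n): set membership checks are amortized constant time."""
    seen, out = set(), []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

# Same result either way; only the cost curve differs.
assert dedup_quadratic([3, 1, 3, 2, 1]) == dedup_linear([3, 1, 3, 2, 1]) == [3, 1, 2]
```

At n = 10 nobody notices the difference; at n = 10 million, the quadratic version is the kind of bug that shows up on a cloud bill.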
I dunno man, this sounds like some unnecessary exclusionary gate keeping on the really broad term of “tech company”. Instead of trying to take this broad term and scope it down for your own purposes why not use a narrow term for your narrow definition. This is like when CS graduates claim people without a CS degree aren’t software engineers and shouldn’t be hired into the same roles... to render JSON on a mobile device.
You're absolutely correct. That's why if I applied to work at any of those companies (I've already worked at Amazon) I would be extremely particular about what project I work on.
Yes, don't think that is wrong though. There are very few teams even in the FAANGs that are working on interesting problems. So instead of talking about tech companies, we should be talking about tech teams.
Then you have your own special definition. A "tech company" is a company whose primary business is tech, including companies pushing out boring line-of-business applications like you describe. It is in contrast to companies whose primary focus is something else e.g. agriculture or pharmaceuticals.
I loathe being in management and am totally aware that I'm probably a bad manager, so I've got a lot of respect for engineers who wind up being put in management roles. But the one thing that keeps pushing me to want to go back into management is getting put on a team with an abysmal or completely missing culture, which I feel mostly stems from terrible leadership.
I'm finally to the point where I'm confident enough with my career to actually make noise and complain about things that I'm tired of dealing with which I wasn't in my 20s, but now I'm dealing with my complaints not causing any change in the end.
This is the thought I've been obsessed with for the last 6 months, and after evaluating everything I think almost everyone is better off going into management in their late 30s or 40s. There is no way out of it.
Some of the reasons:
1. Boredom of doing the same thing over and over in a different language/framework.
2. Still have to get "permission" from (possibly younger) manager before doing anything.
3. Have to leetcode after work to switch jobs.
4. Have to keep learning latest framework after work.
5. In direct competition with newcomers who are much hungrier and with people who don't have many personal responsibilities.
6. Honestly, it feels a bit weird to be the oldest person on the team by a huge margin.
7. Younger devs assume you must be bad at your job if you haven't "grown" out of it in your career, and don't give you much respect.
This is kind of what drives me nuts about software engineering. I would love to work a 9 - 5 but in software you have to constantly be learning, it ends up being a 50+ hour a week career. It does pay well which is great so I have little room to complain but I feel like I am always at work.
yeah exactly. Even if you manage your time well to take care of personal responsibilities, you always have a nagging feeling that you are falling behind.
Have they lost their building muscles, or are they just not building what you want how you think they should? At 55, I'm not as prolific as I used to be, but it's not because my skills have atrophied. It's partly because I tend to work on the bits everyone else is avoiding (often because they're difficult), partly because I try to do things right instead of hacking and slashing, partly because I find it hard to concentrate on doing things within the absolutely insane structures and idioms my younger coworkers have created.
Example: the almost universally used service infra where I work is a nightmare of excessive context switches and tuning to avoid starvation/deadlocks. Why? Because the kidz who developed it apparently didn't read enough to know that the basic paradigm it's based on is known to have such problems. The people around me think this is normal or inevitable, and just live with it, but even the person who did most to popularize these ideas recanted a decade ago. Too bad; we're just stuck with it, because it's the young folk who refuse to learn.
Your older coworkers probably haven't lost their "building muscles" and aren't reluctant to learning anything. They're reluctant to repeat or build on past mistakes. Overall, your comment seems like a good example of how older programmers are often misrepresented by those who don't share their experience. Let people represent themselves.
> which is antithetical to software engineering where a new framework is out every day.
... In some fields of software engineering!
There is a whole world of Software Engineering outside Web/Infrastructure. There are plenty of Software Engineering fields out there where slow changing standards and toolchain reliability are considered a valued feature.
That said, I don't disagree with the need for self-study in software engineering. I just disagree that this need originates from some issue with the high churn of languages/tools/frameworks (which is local to Web/Infrastructure).
Even in the web, it doesn't really change as fast as people say. There have been the same 3 front-end frameworks that actually matter for 6 years now. Have there been new backend frameworks constantly coming out? Where does this meme come from?
Hmm, I'm in my mid 30s (I'm still pretending to be 34 on the basis that my birthday was in May, which I contend hasn't _really_ happened yet due to the pandemic lockdown; it's still March), and fewer of my peers than I'd have necessarily expected a decade ago have gone into management; the "senior/lead/principal engineer" route seems more common.
As an older developer (46), I stay away from the clusterf%%% of modern front end development.
In the last three or four years, I've been going back and forth between C# (.NET Core) using Visual Studio and JavaScript (Node), Python, and Go using VS Code. The setup was: 1. Download VS/VS Code. 2. Install the appropriate extensions.
It makes no sense to me to try to stay up to date with front end development when front end developers are rapidly becoming a commodity with a bunch of boot camp grads. The money isn’t worth it.
I'm the same age. But I am also guilty of old habits when attempting new frameworks or languages. For example, I always want to set up the environment with the least amount of "auto loads" of libraries etc. I do not use NPM. I still play around with new JS libraries by direct downloads from the official websites or Github and always use local copies of .js .css etc. I don't even connect to CDNs for fonts! I want to know all dependencies etc. I am paranoid like that. And this can be exhausting.
Can't agree more. Recently started a personal project in .Net Core + Visual Studio. What a joy. Can't get away from the front-end completely, but I can at least enjoy the backend side immensely.
Have you tried Blazor? I wouldn’t invest time in it personally because I don’t see a market for it, but if I were just doing a side project for the project itself and not to learn a new marketable skill, I would give it a go.
I'm in my mid-20's and everyone my age with a BIT/CIS/%Information% degree looks down on actual coding. They all act like it's way beneath them and focus on some hand-wavy consulting stuff. I work with a bunch of people with this degree and I genuinely couldn't tell you what they do other than add connections on Linkedin.
I'm in my mid 30's and I'm becoming more reluctant to learn new things.
In my 20's I used to enjoy coding just for coding itself. I was always excited to learn a new language, a new library/framework/etc.
When I reached 30-ish, I was more interested in building things. The "coding" part was more of a chore to be honest, I just liked doing it because there was a product to build to solve problems. And the thing is, I can probably build any product with the languages I know today. So many of these "hot new things" are just rehashing ideas that have been around for decades.
It's interesting to see multiple people in their 30s here mentioning the decline in interest of learning. I'm young 30s and I'm literally quitting my job because I can't do it anymore - I can't be bothered to learn new stuff I need for a new project. I'd literally rather quit. I feel so broken because of it and I know my next job is going to be a third of what I'm paid now because it won't be software.
Don't get me wrong, I should have been more specific: the reluctance to learn is specific to tech and programming. I learn a lot of new things during my spare time. Even in the field of pure computer science there is a lot of interesting programming theory to learn.
After decades in the field, I feel like I'm mostly always doing the same thing. Grabbing data from here and there, making sure I rate limit and fail gracefully, serialize shit and deserialize a response, or vice-versa. Writing software is just plumbing and it's infuriating when people just keep changing the size and shape of the pipes just for the sake of it. That's why to me the actual product being built is more important than playing with tech just to play with tech.
I went to school for CS and ended up being a data janitor. I get paid FU money to do what is essentially ETL in many fancy and various forms. All the excitement is gone.
That's funny, I've been thinking about getting into indoor/outdoor painting. I don't know if I'm really serious about it because it would probably divide my current salary by at least 4 or 5, but maybe we need to start a business lol
Are you dealing with anxiety from covid/quarantine/everything right now? I think a lot of us are in this same boat, want to get out of tech, etc. I remember what it was like making a $45k paycheck though. I had such little agency in life. Obviously people can get by on that and do much better than I did, but I've gotten so accustomed to having the freedom to basically buy anything I want (within reason, obviously) when I want. I don't even flinch at a $150/mo gym membership, which there's no way I could afford back then. I remember wanting to learn judo and kendo when I was around 23 and being shocked at how expensive it was; I couldn't do it until I was in my late 20s.
Do I need any of that or is it worth it? Surely not. But the freedom I feel now making 3x what I did when I worked in retail is something I can't really put into words well.
I agree that going back to $50k a year is going to blow chunks. But it's going to be very different this time versus when I was straight out of college - I have developer skills I can fall back on in the worst case scenario, I have six figures in the bank to fall back on, my student loans and car are both paid off, I'm not moving around constantly, I'm not spending money on expensive dating. I think I will be ok. I've never been an extravagant spender on myself, only on others.
Well, I'm envious. I want to pull the trigger with 3 months of savings in my account and it's terrifying. I absolutely loathe the people I work for right now and how they're treating their employees during COVID. I'm just really worried that, with the way things are right now, I won't actually find a paycheck to take home in those 3 months, even with my highly sought-after skills. That, and I don't have the energy to prepare for FAANG interviews and whiteboarding right now; I feel stuck and I hate this anxiety.
Sorry if this is a downer, but honestly even if I wanted a new job immediately I think it would take many many months to find one. I'm planning on it taking up to a year. I've been looking at job boards a lot and even for developer jobs, they're really only hiring senior level positions right now and I'm sure the rare non-senior position is extremely competitive. My last two roles have both been senior, but I would very much be junior for anything outside web development (which I no longer want to do).
Life happens in your 30s. Meaning you get a spouse, house, and kids. Suddenly work becomes less of a priority and even if it doesn't you have a lot less time and energy to dedicate to it.
I also think there's a frustration limit. Like if your spouse is being a jerk, the kids are being dicks, and the plumbing broke you don't want to mess around with a new technology. You want something easy and easy is what you already know.
>I feel so broken because of it and I know my next job is going to be a third of what I'm paid now because it won't be software.
I would not do that. You're probably just burned out. Take some time off and switch jobs before switching careers. Especially one that will pay 1/3 of what you make now. At least start saving up 2/3s of your salary for a while to see what it'd be like.
The problem is that I've done that before. I took a year off not working at all, just traveled and fucked around, and then got my current job. Also my partner is great and I have no kids or really any problems at all in home life. I've only lasted at my current job for a year (already gave notice). Granted my current job is very similar to my last one - maybe that's the problem.
I'm currently saving about half of what I make. Finances will be easier when I'm not the only one with an income and my partner is days away from a very probable job offer.
I'm wanting to transition to project management and eventually product management. Product management can pay pretty well ($150k+) once you're established. I can afford to not really save much for 5 years until I'm back into six figure income.
Maybe you're just in a shitty job? I thought I was burned out on programming a few years ago, turns out I was just burned out on a specific stressful job.
I mean shitty is relative. Compared to manual labor jobs I'm sure my current job would be a blessing. Six figures, sit around all day, job is easy if I had the mental fortitude to force myself to do it, I can get away with only working maybe 20 hours a week.
It's shitty because it doesn't advance me at all, there's no room for growth, no opportunity for learning something that would be meaningful at literally any other job, there is zero social aspect to it since I'm 100% remote and work on projects alone, all my coworkers are on the opposite coast of the country.
I have tried using my free time to get more familiar with ML/AI stuff, but my brain just shuts down as soon as I begin to try. I would love, in theory, to transition towards AI work, but the learning curve feels so steep. The pay would be great, though; I think there is an absolute shit ton of upward room to grow, and I could probably work on some pretty interesting problems (though I'm sure there are tons of "make ads more effective" ML jobs out there too). Maybe if I take some time off work I can try to get back into it, spend 6 months learning, and try to get an entry-level ML job.
Does either camp have second thoughts about how their careers have progressed? You've highlighted one or two downsides for those who've gone down (or should I say 'up') the management route. Would you trade places with any of those friends? Would either of them trade places with you? Do you have an 'exit' plan out of software engineering or do you plan to stick with it to the very end? I appreciate some of these are personal questions and you may not want to discuss it but these are the perspectives that interest me.
In 2008, I was 35 and had let my career, skills and salary stagnate. I had been at a company for almost a decade, mostly writing a combination of C, C++, Perl, and VB6 programs for backend processes. I finally woke up, did a career reset and pivoted toward “enterprise development”.
Fast forward to 2016: I was married, with a stepson who was a freshman, tired of working on yet another software-as-a-service CRUD app at my 3rd job since 2008 as an IC, and I jumped on an opportunity to be a dev lead at a medium-size non-software company.
I thought the next step was to either stay a hands on dev lead/ “architect” and just muddle along for the next 20 years, go into management, or go the r/cscareerquestions route and “learn leetCode and work for a FAANG” and move to the west coast.
Neither sounded appealing. Then management decided to “move to the cloud”. I didn’t know anything about AWS at the time and saw how much the “consultants” were making, and that opened my eyes. If these old-school netops folks could pass one certification, click around in the console, and make $200K+ a year, imagine what I could do if I knew AWS from the infrastructure and dev ops side and I knew how to develop and architect using all of the AWS fiddly bits.
It took three years and two job changes in between, but I really like consulting. It’s the perfect combination of development, high-level architecture, and customer engagement, and you never know what you will be doing in three months - or in what language.
The company I worked for as a dev lead was acquired by private equity and by the time I had any knowledge about AWS the infrastructure gatekeepers and consultants took over.
I started looking for a job and got lucky that another company was trying to build an in house development department led by a new CTO. They had outsourced all of the development before.
The new CTO was very forward looking and wanted to make the company “cloud native” and improve the processes. He only had a high level understanding of AWS as did I. He took a chance on me and I became both the de facto “cloud architect” and the person he called when he wanted a customer facing project done from the ground up without having to deal with the slow moving “scrum process”.
I was quite happy at the company and would probably have stayed a couple of years, even knowing I could make more money somewhere else. Then Covid hit, along with an across-the-board pay cut.
I was still not really looking, a 10% pay cut at a time when we couldn’t travel or really go out was an inconvenience but not earth shattering.
Then a recruiter contacted me for a software development position at Amazon. I wasn’t willing to relocate or do the leetCode monkey dance but we talked a little and then she forwarded my information to a recruiter on the AWS side.
I saw the interview process was basically a high level technical interview to determine whether I knew the basics of AWS (I did) and all about the Leadership Principles. I knew I could answer the “tell me about a time when...” questions with the best of them and the interview process was going to be fully remote.
To keep a long story from getting longer - I work at Amazon as an AWS Consultant from the comfort of my own home in the suburbs in a low cost of living area.
> Does either camp have second thoughts about how their careers have progressed?
Anecdotally from our conversations, I would say that they see computer programming as being less valuable than people programming. My friends do have lots of ideas for things and one in particular will keep saying that he wants to "re-learn iOS dev" or "learn Elixir", but he never does. I've started down the path of learning how to angel invest, which is where I'm trying to learn my people management skills.
> Would you trade places with any of those friends?
Absolutely not.
> Would either of them trade places with you?
I doubt it.
> Do you have an 'exit' plan out of software engineering or do you plan to stick with it to the very end?
I'm already on a trajectory where I won't need to work a traditional job the rest of my life. None of my friends who are climbing the management chain are anywhere near that. I honestly love building and learning (I just finished up a 3.5 day hack-a-thon yesterday). My next path will lead me either to building a company, helping people start tech-enabled companies or helping someone co-found a company. Unless I become a CEO, I don't plan to stop programming.
As with my age, I'm in the middle of both worlds too: I'm a manager but I also do development whenever that's needed (it's not a tech company). Development, while it can be frustrating, is very enjoyable, as you feel that you're creating something tangible and useful. The management side is never-ending issues. More often than not I have to operate without having a full picture, so decisions end up arbitrary. You also get to be the person who makes decisions for people when they don't want to make them. From the wider business perspective it's quite interesting, as you get to know a lot about what's going on in the company. I'd love to do development full time, maybe in a smaller company rather than some big corp environment. Ideally, I'd have done 10-15 years of development and only then gone into management, but I'm not sure how viable that is.
> They also seem reluctant to really want to learn anything else which is antithetical to software engineering where a new framework is out every day.
Actually I'd say that's a sign of maturity. They've grown wise to the idea that our industry has a fetish for reinventing the wheel and refuse to take part in it.
"which is antithetical to software engineering where a new framework is out every day."
Hey! Objection!
I'm a software engineer and I have no need to keep up with random web frameworks popping up left and right. I'm fine writing my C++ and C#, thank you. I have to keep up to date, but the stack evolves over a decade, not every quarter (or whatever the cadence is for web stuff).
Become an expert in a field and then it's irrelevant whether you know framework xyz or not - it's no longer a critical requirement. It's critical that you have domain knowledge, and whatever the tech stack is, it's expected you can get up to speed in it just fine on the job.
I've come to the conclusion that tech is one of the least meritocratic industries in our economy. As a result of this, many older workers who "didn't make it" are deeply frustrated by the increasing gap between their ever improving skills and their ever decreasing remuneration. This frustration with the system is also what makes them undesirable to hire. They don't believe in the meritocratic fiction so if you hire them, they may cause cultural problems and dissonance inside the company (and they may come across as apathetic).
To make matters worse, those who succeeded have an incentive to strongly believe that the tech industry is a perfect meritocracy, so they feel like something must be wrong with these older people and this is why they still have to work as developers and didn't progress in their careers.
The affront which reality may pose to their egos causes employers to hire inexperienced, starry-eyed developers on huge salaries who create chaos and busy-work for themselves whilst yielding diminishing returns and barely getting things done. But the monopolistic, winner-takes-all, easy-money situation of this industry allows employers in many sub-sectors to make these terrible decisions whilst continuing to thrive, and this allows them to maintain and even reinforce their false ideologies about youth and productivity in a vicious cycle of increasing mediocrity.
Older programmers need basically no hand-holding/guardrails. That alone can make them much more valuable from a management perspective. Here is the project get it done and its done.
Younger programmers, sure, they might work 16 hr days and weekends. However, if I'm having to check that they are not running wild in left field for 2 hrs every day, it's a huge time hit to productivity, and it just increases the more of them there are. I don't know that I'd want to work someplace that thinks butt-in-seat is a measure of productivity.
To me, a job that seems to be appealing to the younger crowd just means the job will be a miserable, no-life-balance grind. There are plenty of jobs out there that just value nothing ever breaking that I'd rather work at than some flavor-of-the-week startup.
I'm working in an outsourcing company on a project which makes use of all kinds of engineers - not homogeneous programmers working on the same stack in the same language, but actually a lot of diverse tasks: some work on analog hardware, some on digital hardware, some people specialize in controllers A, others in B, some are specialists in video, some are "regular" application programmers, and the list goes on. And most of the engineers are in two offices - one in Ukraine, another in Israel.
So this very long preface was about the age of our employees - despite working on the exact same problems, the office in Ukraine almost never hires "older" engineers of any specialization. By "older" I mean probably 50+, maybe even 45+. In the same general space where I'm sitting, the oldest person I know for sure is 40 years old, and he is not a recent hire - he was hired when he had just passed 30.
In Israel, on the other hand, a large share of the engineers are "older", and they are getting hired while "older", precisely because of their experience in all kinds of domains we are working on.
Just wanted to share what this problem looks like in two different countries.
The number of software developers doubles roughly every 5 years. So if you have been in the field for 5 years, half of the developers will be younger than you; if you have been in the field for 10 years, 3/4 will be younger. You can do the rest of the math.
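For what it's worth, that back-of-envelope model has a simple closed form: under the assumed 5-year doubling, the fraction of developers with less experience than you after t years is 1 - 2^(-t/5). A quick sketch (the doubling period and the model itself are the commenter's assumption, not measured data):

```python
def fraction_younger(years_in_field, doubling_period=5):
    """Fraction of the field with less experience than you, assuming
    the developer population doubles every `doubling_period` years."""
    return 1 - 2 ** (-years_in_field / doubling_period)

print(fraction_younger(5))   # 0.5
print(fraction_younger(10))  # 0.75
print(fraction_younger(40))  # ~0.996 - matches "most developers are younger than me"
```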
I've been working as a developer for 40+ years, so most other developers are younger than me. Although, on my current team there are two people older than me.
> Against this background, we argue that preserving employability cannot and should not solely be the burden of the developers themselves. Companies should take their share of responsibility, encourage practices that welcome developers of any age and curb those negatively affecting older developers.
Ok, I don't disagree. But how is such a conclusion justified by googling "age software developer" and seeing what pops up? The paper seems pretty thin.
I can prove it, with a huge portfolio of stuff I was working on fifteen minutes ago.
I stopped looking for work some time ago.
Let's just say that I have the skillset and chops that many startups would kill for, but I don't have a...how do they say it, these days...gosh darn it...I just can't keep up with the stuff they say...oh yeah.."cultural fit." Yeah, that's it.
I have to say a lot of technology "change" is fads. The New Thing takes over before anybody seriously vets the new idea. Once something looks like "legacy-ware", everyone gets scared of being obsolete and abandons it for the new kidware on the block.
Experiments are nice, but making everyone in IT a guinea pig is not productive. Rapid change for the sake of change alone will probably "benefit" younger workers on average. Fashion-driven IT is probably not good for organizations in general, not just for older workers, because they are paying to throw out software and start over too often; but it seems nobody can stop this Sisyphean dance.
This thesis makes a heavy case for plastic surgery to appear younger to keep a tech job - Has anyone reading this ever even considered plastic surgery to get a competitive advantage against ageism?
For cultural context, in some societies plastic surgery is seen very differently than it is in the United States, as I understand it. So, there's precedent for this internationally.
In South Korea, for example [0], there is a long and interesting tradition of cosmetic plastic surgery, such that "standard" procedures (such as the one you suggest) are kind of expected at a certain level, in the same way that relatively standard, elective dental procedures (straightening, whitening, veneering) are expected in the US.
To make a crude and probably-flawed analogy, it's seen as an investment in your image; kind of like painting your house, cutting your hair, or purchasing designer clothes.
This all being said, I personally don't think this will penetrate the software engineering market too much. IME, if a company is practicing the type of superficially-applied, ageism-based hiring discrimination (the kind that could be gamed with cosmetic surgery), it's a red flag that they're not a serious software shop.
I don't know if you want to invest resources in improving your odds at a game that you don't want to be playing. There are much better games at the casino.
Seriously, the youth cult is concerning; I can say my best customer projects were without doubt those with a mixed team, both in terms of age and gender; mixed teams help to ground the project, staying focussed, and not becoming a nerd pissing contest.
Not as permanent, but my dad says he notices that clients listen to his suggestions to update 10-year-old network equipment a lot more readily when he dyes the gray out of his hair. He first did it for a high school class reunion, and has kept doing it since.
Wow, that is a sad state of affairs. I like seeing grey hair and find it a bit awkward when men dye their hair; I see it as desperation to stay young, or fear of aging, or a lack of development in other areas and holding on to young looks as the only achievable thing. But it makes sense that your dad does it for the job, namely to be taken seriously. But I agree with other posters: keeping fit goes a long way and is something most people could and should do.
My wife, in her late 30s, stopped dyeing her hair when the pandemic started, and she's got a lot of gray hair, especially compared to me, who has none. She remembers having it since she was 18 but kept dyeing it afterwards. I find her looking great with gray hair. I have to add that she is in great shape, and that compensates for the gray hair.
If women are okay with gray hair, so should men be. I don't think hiding the gray hair does any good. From what I understand, the reasons for not hiring older people are a preference for a younger culture - people without experience are more gullible, and the pay is lower. Dyeing one's hair ain't gonna solve those requirements of the hiring companies. And in addition, some people start graying as early as their 20s and never bother to dye it.
We're talking about physical perception, that's why people go out of their way to get plastic surgery, braces, wear makeup, etc, to impress other people. You can say how things should be, and what the ideal world should be. But the fact remains, physical appearance helps immensely, whether that person is qualified or not. Why do people put on a suit for a silly interview? Why can't I just come in with flip flops and shorts? Yeh the world sucks, you play the game in a way that gives yourself the best chance to win.
57 year old software dev here. It does definitely get tougher to find and keep work as you get into your 50s. What I've found is that it's a lot easier to get a job at a company that's run by people my age or older. I've been at a few startups over the last several years where the CEOs or managers were around 60. The interviews are a lot easier - kind of like interviews were back in the 80s & 90s probably because those are the types of interviews they remember as well. Being of a similar age they also tend to figure that if you made it this far you're probably going to be a good worker. I've had way more autonomy in those situations than I've had at other companies where I had 30-something-year-old managers. Maybe I'm falling into some reverse-ageism here, but in my experience I'd rather work for someone closer to my own age or even older. Problem is that there are getting to be fewer and fewer of these options as people tend to retire in their 60s.
Also I think that a big problem is our health care system. I may be wrong on this, but I imagine it costs a company more money to employ an older person who may actually use their health insurance or have health problems.
Removing the burden of health care from companies would maybe solve a bunch of these problems and allow companies to risk hiring older people.
That's probably accurate, but if 20 people apply for a position and one gets hired, is it not accurate that nobody is scrutinizing some list of 19 reasons why 19 people didn't get the position?
These kinds of things seem like there is no way to enforce them.
I don't even know if that's a thing, health insurance being more expensive for companies with older employees or whatever, but just in general. "You can't fail to hire such and such type of person because it's illegal," doesn't seem to actually have much practical ability to influence a hiring process if it wants strongly enough to not be influenced.
I haven't done any research on this and could be completely off base. Just have the same thought when I see this mentioned in general and started typing this time.
I don't disagree with the general sentiment, but I don't believe the reasoning is actually true. Generally companies pool their employees into a risk pool. Smaller ones will find a broker that does it, larger ones probably do it with their company population. But in general, the risk gets spread out enough that a couple individuals do not cause insurance to spike.
I could be wrong also, not an expert by any means, but that is basically what our HR rep told me when I inquired (not because of age but because my wife is high-risk/high-cost, depending how you look at it). I was told that was not part of the equation because of how it was setup. It could if it was a small enough business that tried to negotiate something themselves, maybe?
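The pooling intuition from the last two comments is easy to make concrete with invented numbers (all figures below are illustrative assumptions, not real insurance data): a handful of high-cost members barely moves the per-employee average in a large pool, but shifting a large fraction of the pool does.

```python
BASELINE_COST = 4_000   # assumed average annual claims per employee
HIGH_COST = 40_000      # assumed annual claims for a high-cost member

def avg_premium(pool_size, n_high_cost):
    """Average annual cost per employee in a shared risk pool."""
    total = (pool_size - n_high_cost) * BASELINE_COST + n_high_cost * HIGH_COST
    return total / pool_size

print(avg_premium(1000, 0))    # 4000.0 - baseline pool
print(avg_premium(1000, 5))    # 4180.0 - a few individuals barely register
print(avg_premium(1000, 200))  # 11200.0 - a pool-wide shift does move premiums
```

The first two lines support the "a couple individuals do not cause insurance to spike" point; the third illustrates the counterpoint that a culture-wide demographic shift would raise the pool's premiums.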
But we're not talking about a couple of people, we're talking about a culture shift to prevent ageism. In that context, the average age definitely increases, which definitely increases the risk and premiums for that pool - which was the parent post's exact point.
It's easier to lead younger developers, as they're raw and don't have the experience to call out their management chain. That's something older developers do, as they've seen it all before and know how the latest trend ends. Management generally hates that.
On the flip side, more experienced developers (nevermind age) might have used that experience to become set in their ways. Younger developers might be more moldable
This is a good question to ask regardless of your career. It's a combination of factors: specialization narrowing the range of possible jobs, experience tending to command a higher salary, and flat-out age discrimination.
You will likely have peak earning years, and that means earnings will decline post-peak. This is true for most people, though not everybody.
If you know this is a likely pattern, you can do something about it. Namely, live below your means, invest, bank raises, keep a lid on lifestyle inflation, and accept that some day you will take a job that pays less than your previous job and that this change should not be ruinous.
By then it's already too late. You should have an exit strategy while you're bringing the money in during the early years. Most folks in their twenties don't realize it: they're forty before they know it.
agree. max out your 401k and buy index funds if you can. plan your retirement early, or save up your money and move to a developing country like Thailand or Taiwan, or any safer developing country with a low cost of living, and just enjoy your life.
my plan is to retire in my 50s in Asia and just travel around Asia and do some freelance work.
I'm wondering too whether I'll be able to find work when I'm in my 50s; I'm now in my early 40s. I am hoping I will find some, even if I have to lower my salary expectations, which is kind of okay, especially if it's work that I'll enjoy.
5 years of X just says that the job market can sustain some number of requests for candidates that don’t require as much ramp-up time once hired. It doesn’t mean the candidates will be old, in fact looking for highly specialized tech stacks rather than experience in the business domain of the product is a sign of someone looking to hire young.
1) Those who believe in age stereotypes
2) Those who don't
There isn't some cliff you fall off of and become, "too old." That is one of the central lies of ageism. You're suddenly too old to be cool, you're too old to learn new things, you're too old to start something new, you're too old, too old, too old.
So get it in now, while you're young! Work hard, put your head down because once you're over that cliff it's done! You'll be too old to be a programmer anymore! You'll be stuck in your ways, a crufty old fossil, someone who only shows up and doesn't learn anything new.
I'm getting older. I'm picking up skateboarding. In recent years I've learned how to formally verify systems using model checkers and theorem provers. I've picked up on category theory and abstract algebra. I started streaming my side project to learn how to implement a database from scratch in Haskell.
You'd think that I should either go into management or find another career. This is a young person's game! You have to keep up with all the new technology! Old people can't keep up!
The lambda calculus turned 84 this year. Binary trees have not changed much. There was a paper released eight years ago that showed us how to implement co-ordination free operations on B+-Trees. Hooray.
The real innovation still happens in small steps at the periphery.
Ageism makes us believe that our best years are short and fleeting. That all of our work will be behind us for the majority of our lives. To talk about expertise as a deficiency and experience as useless.
One thing that older engineers often have that younger ones don't is family commitments. I am in my 40's with all the things that come along with that: wife, kids, expectation of personal time, etc. With Covid hitting, my productivity has plunged. I have young kids at home that I have to tutor, and my wife is working from home as well, so I'm constantly getting distracted by her work calls. Kids stress both of us out, dogs barking, etc. It's almost impossible to get any deep work done. I work with a bunch of people who are either younger without families or my age and childless. They have not been affected at all. Employers very often want someone who will live to work for the company; many middle-aged engineers cannot / will not do that.
Just remember, everyone... a majority of the people who respond to surveys like these are people who are unemployed right now or have a gripe with how their life is. Very rarely do you get the person who is doing well and happy responding to surveys.
What survey? This is a research paper, and from the research paper:
"We start by qualitatively analyzing 24 popular online articles. We contextualize our findings using scientific literature on ageing in software development and a follow-up study investigating discussions about the above-mentioned articles on Hacker News"
But quite often that is valid. Any job can only pay as much as the job is worth.
If a more experienced developer who demands a higher salary is taking on work which can be carried out by a less experienced and therefore cheaper developer, it's to be expected that the employer wouldn't pay for the more expensive one.
Just because someone is a world-class Michelin-star chef, you're still only going to pay them McDonald's wages for flipping burgers.
Right, but if you give someone the bracket, they say they're fine with it, they interview well and then HR says "it's too expensive" and you're billing yourself as a youthful fancy startup... then reading between the lines is a skill.
While it's certainly not the case everywhere, the truth is that a lot of software development needs out there aren't terribly challenging to get to a "good enough" state from a business value perspective. This is exacerbated by the hyper-specialization of shallow roles where companies are basically looking for technologists. Once upon a time people used to hire software developers expecting them to figure out how to solve problems across almost all software layers and to take their time doing it. Now many positions are for churning out work with a specific library or framework in a specific language. It doesn't take a long time for there to be diminishing returns in terms of the worth of each extra year of experience when someone's role is to work as a "React developer" or something like that especially if someone else is responsible for making the designs and another person for setting up the backend or if the bulk of the software can effectively be outsourced by leaning on open source tech.
Factors like flexibility in work hour expectations and compensation favor younger developers who tend to have fewer responsibilities outside of work but have enough specialized skills to get the grunt work done and are able to put in the hours to make up for their lack of experience. Generally younger developers are not as well suited for making lots of bigger architectural decisions since they don't have the experience to really grasp longterm consequences of different approaches, but in terms of numbers, you need fewer people making those decisions than those doing the grunt work and young people with just a few years of experience often seem to do the grunt work about as well as anyone.
This wouldn't be such a big problem for older software developers if there wasn't a pay expectation gap or if they were as likely to be flexible about hours. Rightly or wrongly, however, people expect higher compensation for their experience, even if much of that experience isn't actually producing that much more business value given the sort of work that they are actually stuck doing. The world definitely has a place for the John Carmacks, but it's really not that large a percentage of jobs where they can make the most of the value they have to offer.
I do wonder if there's some conflation of age and institutionalisation; as someone in mid 30s who knows plenty of people in the industry a decade or so older, I'm just not seeing the "life ends at 40" thing that many commenters seem to imply.
However, someone who's worked for 20 years at the same place (especially if it's a place where things are done A Certain Way, and anything else is heresy) may well have trouble moving job. And such people will of necessity be older. I suspect that a lot of the perceived problems for older people moving jobs are really more about institutionalised people moving jobs.
Also an older dev (56), and I agree with others' comments that learning the myriad FE frameworks feels like chasing my tail.
I've carved out a role where I work primarily on the back-end of my company's platform, about 50% coding, about 25% mentoring/coaching, and the rest is working with our product managers and business users to develop new functionality.
I feel valued and feel like I'm making an important contribution to the company.
But I am concerned that one of these days I'll no longer have the cognitive muscle to do all of this.
As an 18 year old, I’ve learned so much in the past few years about developing good software it’s mind-boggling. I can’t imagine what 10, 15, 20 years of experience would look like.
What you learn over those years is incredibly helpful, as it takes a lot of breadth/depth and time. But the rate at which you learn things goes down.
Similar to learning a foreign language. At first you are learning new words every day. At some point, you know all the common words, so you can't keep learning new words at the same rate as before.
Congratulations on already having a couple of years of experience at your age. I was in the same boat 10 years ago.
As for 10, 15, 20 years of experience: as long as you make sure it's actually 10 years instead of 1 year repeated 10 times, you mostly gain awareness and intuition for engineering at a higher level of abstraction, as well as for the place of technology in a broader business context.
Ruby, Django, MVC, Elixir, TS, React, Vue, functional programming, immutability... specific things, tools, technologies, approaches are becoming less important. The experience is getting transferable and you learn when it makes sense to code the Rails way in Django.
Slowly, "I'm going to learn React/Vue so I can do front-end" will become "I have to do front-end, so I'm going to use React/Vue". When choosing technology, you'll realize that the "job" in "finding the best tool for the job" is not just delivering the solution. It's making sure the solution is easily maintainable for years. On that timescale, stack homogeneity, hiring strategy, etc. sometimes become even more important than the tech fit. You might think serverless is the way to go for your new task. But is it really worth it to introduce Go lambdas to the system when you already have 20 Ruby devs?
I recommend an early switch from "I'm an engineer, let me know what to code" to "I'm here to grow the product/business, let me see what I can do". No one pays for delivering code; companies pay to solve business problems. The problems you solve don't have to be contained to those that come with your job description, and the solutions don't have to involve technology. No matter the aspect of the inner workings of a company (or life in general), everything can always be improved. Identify the things, suggest/provide solutions, and you'll be both satisfied and rewarded.
You learn how to deal with people, developing taste and instincts about problems and approaches, and have the time to deep dive into a few specialized areas.
When I was 18 I was coding C and learning Perl. I was writing scripts to build my website from a bunch of text files, ordered by their creation date from most recent. I was hacking on BBS door games and getting into MUDs. I didn't know what was impossible or hard. I just threw myself into anything and believed that I was unique, working on the frontier, that I was part of something.
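The site-building scripts described above (gather a pile of text files, order them newest-first, emit a page) are a classic beginner project that still holds up. A minimal sketch in Python, purely illustrative — the file layout, names, and HTML are assumptions, not the original Perl:

```python
# Illustrative static-page builder: collect *.txt posts from a directory,
# sort them by modification time (most recent first), and write one HTML
# index. All names here are hypothetical, not the commenter's actual code.
import html
from pathlib import Path

def build_index(src_dir, out_file="index.html"):
    posts = sorted(
        Path(src_dir).glob("*.txt"),
        key=lambda p: p.stat().st_mtime,
        reverse=True,  # newest entry first, as described
    )
    parts = ["<html><body>"]
    for post in posts:
        parts.append(f"<h2>{html.escape(post.stem)}</h2>")
        parts.append(f"<pre>{html.escape(post.read_text())}</pre>")
    parts.append("</body></html>")
    Path(out_file).write_text("\n".join(parts))
    return [p.name for p in posts]  # newest-first order, for inspection
```

Run it over a directory of text files and it produces a single dated-feed page, the same idea that blogs and static-site generators later formalized.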
As I became older I started to get smarter. I realized that most things have been done before and that progress came in small increments. In waves I came crashing upon the shores of self-doubt; each successive tide washing away hubris and pride, leaving behind the rocks and sand. I realized how little I knew.
I don't think we become fossils. Erosion moves the shores and the process changes us, our shape, and our character. What we value may change. What skills we need to achieve our goals becomes apparent. What knowledge and ways of thinking are ultimately valuable prove themselves with time.
Youth is an important time in our life. The hubris and arrogance are useful to us in that phase. But age doesn't ossify us or make us irrelevant. We don't crystallize and become lost in time. You don't wander over some precipice and become, "too old."
That's the great lie told to youth. It's a great time in your life but it's not "the best time in your life." Who knows what will come next?
I look back on some of the code I wrote in my youth, my writing, my thoughts... I'm much different now. I'm satisfied with how I've progressed in most areas and feel like I could use some work in others. Life isn't over until it's over. Some people get their PhD's in their 50's. Some people do their best work early in life. It's not over until it's over.
So enjoy it!
Update: ultimately, the timeless stuff tends to be the less trendy stuff. Maths: learning how to think abstractly in precise terms goes a lot farther than learning a particular framework or technology. People skills: how to get people to work together on problems. Business skills: how your work fits in the broader context of society and the economy. All of these are topics that can go deep. It takes time to learn all of this stuff. The worst thing is realizing that you won't live long enough to dive into it all.
I'm 43, with 20 years of solid experience in most major languages out there, from backend to frontend.
I guess the virus is partly to blame, but I'm having a REALLY hard time finding work right now; and I'm applying all over Europe with a focus on Amsterdam & Copenhagen, remote or otherwise.
I suggest switching to consultancy/freelance work. The interview process nowadays in Europe is just a time waster.
Before switching to consultancy, I've wasted almost half a year applying to various jobs. Multiple rounds of interviews, coding challenges, specialized resumes and everything in between. Average response time from recruiters was 2 days to 1 week.
Took me less than two weeks to get my first client as an independent contractor on a highly competitive freelance platform.
Also, don't hesitate to sign up with recruitment agencies. I was quite pleased with the quality of forwarded jobs from recruitment agencies.
All these articles about the market forget to mention that tech jobs are hyper-concentrated in a handful of cities around the world (Silicon Valley, New York, London).
The rest of the world is struggling to get any job. It's not unheard of to have one hundred developers for one role outside of the major capitals.
Why would American companies hire older European developers when we aren't even certain that American companies have any appetite for older American developers?
Have you considered working in the US? Seems like we're perpetually in a shortage of good web devs (e.g. can do more than paste together frameworks and Wordpress sites)
Honestly, you are too old for the European market. There are no influential technical companies in Europe, no place to accommodate old people like you and me. I am a couple of years younger than you, but still the oldest after my manager in the department. I tried writing a few applications before coronavirus, had a couple of interviews, and saw that many places hire only young guys with a couple of years of experience. Remote work wasn't even a topic back then.
Real-world IT services seem to be dominated by middle-aged persons. Accenture, Dell EMC, HPE, Red Hat, etc. Particularly integration roles in gov, health, banks. It takes decades to build contacts, and most folks seem to know each other from way back and have similar war stories ;)
The cognitive dissonance of the tech industry is that many companies do in fact value their longtime employees, but they won't hire any new employees of the same age.
You can be old in tech, as long as you were hired when you were young.
They list documents describing 30-year-old developers as old... are you kidding me? At 30, developers are just starting to understand the job (like all of the non-coding skills).
Question - why is it that experience / age / wisdom is an asset in other professional trades [1], but it is seen as a liability in tech?
I have a couple of theoretical answers.
One, the perception of tech workers as semi-skilled labor (like low-level clerical work) lends itself to their being managed with "cost center" economics. Middle managers from the baby boomer and gen X generations grew up with the idea that tech can easily be "off-shored." The perception is that tech work is a necessary cost, not a strategic element of pretty much all businesses. However, the next generation of middle managers, who grew up not knowing what life was like without the internet, will recognize how important tech is to the business and will shift the corporate attitude towards tech work.[2]
Two (this one will be a lot more upsetting), tech workers were introverted and passive, which makes them easier to take advantage of by middle managers who are not afraid to bully them around.[3] So when tech workers get laid off after age 40, rather than fight and adapt, they blame ageism and go quietly into the night. However, the next generation of tech workers seems to be a lot more assertive in their comp and career expectations. I know I am.
Three, a combination of the above.
Fourth, none of the above and I am full of $hit. In which case, I am interested in others' thoughts.
[1] Law, medicine, finance, accounting, etc
[2] A quick story on how corporate culture can change between generations. When IBM was first trying to sell their PCs to businesses, none of them were buying. So they hired a consulting company to figure out why. The consultants came back and said to IBM that the problem with their PCs not selling was an actuarial problem, not with their strategy - the younger people at those companies were excited about new tech, but the decision makers (many of whom were much older) did not see the value of new tech. Consultants told IBM to wait until the decision makers retired. The rest is history.
[3] I have seen this play out, where tech workers are too intimidated to push back against poor business decisions, like over-optimistic delivery deadlines.
>Question - why is it that experience / age / wisdom is an asset in other professional trades [1], but it is seen as a liability in tech?
Silicon Valley (software) Culture.
Tech (software) companies want cheap labor and believe youth can overcome any difficulty. It's deep-rooted in Silicon Valley culture, from the 19-year-old Bill Gates, Steve Jobs, Mark Zuckerberg, Evan Spiegel... etc.
While on the hardware side, most engineers/founders are much older, because hardware is more unforgiving than software.
I think software professionals' problems go deeper than biased perceptions of management that undervalue experience. Unlike other forms of engineering, software hasn't developed the same feedback-driven processes to ensure efficacy, reproducibility, safety, or reliability -- where standard metrics of practice have been developed through methodical exploration, optimization, and repetition. To wit, we still don't know how to make software 10% better. Ergo, we don't know in what way a more experienced developer can add value over a newbie. So why pay older devs more or retain them, when it's easier and cheaper to boss around neophytes who don't have family distractions or need health care?
Professional advancement in any field requires mastery. But instead of building "on the shoulders of giants", as is central to masters of science and of classic forms of engineering, the definition of software mastery remains qualitative and vague. Historically, evolution of software tools, practices and standards has been driven less by the systematic learning of what techniques work or don't, and more by adoption of styles and fashions: "This programming model/tool is better than that alternative. How do I know this? I just do, and I'm a smart guy."
Maybe it's not surprising then that even software managers, who came up through these ranks and presumably should respect the evolutionary enrichment and advancement of software pros, don't. In the end, it's cost and trendiness that rules the software dev hiring process, rather than experience-driven insights into S/W practices & principles or battle-hardened expertise in building better apps.
My theory: When I was starting out, the company "gray beards" were highly valuable and well-respected. Then "tech" as a meaningful name for a business vertical became diluted to the point that tech wasn't necessarily valuable to "tech."
GE created NBC, but NBC isn't branding itself as an engineering company. GE could realistically be called an engineering company despite operating in many other markets, and NBC was probably instrumental in cutting-edge broadcast tech for decades and decades, but that's not the business it was in.
I have a slightly different hypothesis on #2.
Tech workers are busy learning, and we are overloaded with actual work. We don't have as much time to play office politics. Also, a lot of the problems we face are hard to communicate, and we end up facing them in isolation.
... and the methodology seems poor too. I had a sideline in magazine journalism for a while... I'd be interested in the correlation between article bias (pro / anti) and the age of the writer (or target readership).
It's a shame that this paper uses a URL-shortening service as an intermediary for its links. Eventually, those links will become useless at the discretion of that service, and you can't even use them in the Wayback Machine when that happens. Not only because those URLs are unremarkable on their own (I doubt the bit.ly links are present elsewhere on the web, so the Wayback Machine would not be triggered into archiving them), but also because I doubt the Wayback Machine would scan PDFs for links.
1) Ones who keep up their skills
2) Ones who don't
The former are a treasure trove of knowledge and skills, and provide substantially more value than anyone junior ever could. Going through so many computing eras gives a higher-level way of thinking about abstraction, or understanding computer architectures. They've hand-tweaked assembly, C, Java, and when they're now doing JavaScript or Python, they understand all the layers of metal underneath. They've gone through flow charts, structured, functional, object-oriented, and all the variants thereof. They've written high-speed algorithms to draw lines with pixels, to ray trace, and are now coding GPGPUs.
The latter are liabilities, bringing in 1980s-era best practices. They're working on some legacy BASIC or COBOL system from the seventies, and are surprised they can't find a new job when that's upgraded and they're downsized.
I've rarely seen #1 and #2 mix. They're very different crowds, and in very different types of companies.
My own experience is that I need to devote about 20% of my time to keep up, if I'm doing it on an ongoing basis, or about 1-3 months every 2-5 years if I do occasional deep dives. Basically, I dive headlong into whatever is the newest, trendiest stack, and get a product out using that, deeply learning all the deep things behind it too. That's what works for me. YMMV.