KT is KIA
In the spirit of peeking behind the curtain… I’m exhausted and didn’t feel like writing this week lol. But I also think that it’s been a really good mechanism to get my brain going and get me OUT of feeling the malaise of… gestures at everything in the world all this.
So with that, I wanted to talk about how we learn and how we teach, because again… it’s a thing I think about a lot! (Which is two words, unless you mean allot, but that’s something different from a lot, which is where my brain is going, and this is your indication that today will be even more stream of consciousness than normal lol.)
And admittedly, the headline is a little misleading, or at least it feels misleading to me considering that normally when we’re talking about “knowledge transfer” in tech, let’s be real… it’s in the context of someone finding a better gig and no one understanding what they did so we frantically try and get their knowledge out of them and into some form of documentation and other people’s brains as quickly and efficiently as possible. But I don’t want to talk about that. I want to talk about how people learn to be an effective engineer within a work environment lol.
This isn’t new territory for this substack - we’ve already talked about the Mentorship Desert, the Help that Hurts, and the Efficiency Trap - but this is different. Most of what we’ve covered is “how does this affect new engineers,” and today I want to talk about the other side of the coin. I think one of the assumptions we’ve been making is that the leaders and mentors are benefiting from the current state of… well, everything… but one of the skills you always needed as a leader or a mentor was the ability to teach - and I think we’ve also completely lost the muscle memory of how to teach and train, not just abandoned the folks who need to learn.
I’ve been treating these explorations as if, given time and space, the mentors/leaders could just instantly start doing the level of training and teaching that less experienced engineers need. But what evidence is there that that would be true?
And I have to be honest, this might not be a “new” problem. I’m sure the majority of people never had mentors who did a great job of training them and just had to figure shit out. But even if only 20% of the mentors out there could ever really teach, I think that number is drastically lower now than it’s been in my lifetime.
There are a few main culprits here:
- We’ve diminished the role of teachers in society and made it so that education isn’t valued
- AI (and Stack Overflow before it) as a “stand in” for mentors disincentivizes leaders and mentors from spending time training and teaching
- AI as a stand in for code reviewers means taking training opportunities and turning them into disambiguating “tasks” instead
And just to shake things up, let’s take these in reverse order!
One of the promises I saw with the advent of AI was the ability to take mundane tasks that no one wanted to do and automate them to create value without creating “churn”. I loved to focus on documentation, because no one ever likes writing documentation and it’s always the first thing that gets ignored when we get into time crunches. But what I’ve seen instead is that the focus tends to be more on things that are tedious but necessary and done by the more senior engineers… namely code reviews.
And I feel like this is where people lean because the nature of code reviews is just misunderstood. It’s seen as a check that ensures that we’re writing consistent and clean code, and that’s a portion of why it exists, and it DOES create better code. But it’s not just because the senior devs are out here making sure that we’re formatting our code consistently or catching bugs.
It’s beneficial because it’s a training tool for the rest of the team to teach them about standards we expect, why some solutions are better than others, and philosophical debates about why we write code the way we do.
The best code reviewers I know are the ones who will first ask “why did you do it this way” rather than just rejecting it because it’s done differently than they expect. And the reason is that sometimes, your way isn’t the best way! That’s a part of teaching I think is neglected - teachers often discover new ways of teaching and new ways of doing things through the act of teaching itself. Otherwise they stay stuck in their own perspective.
How would you know that this weirdo way of writing code results in a solution that is more elegant and future proofed than the standard unless someone on your team is out there trying to break rules and see what they can get away with? And how would you know to try it if an AI is telling you that it isn’t the standard, instead of interrogating why they did things this weird new way.
And if you descope code reviews from the responsibilities of the mentors and leaders, then that muscle atrophies. And it doesn’t just come back. Even when you realize that the AI code review tools don’t improve performance, trying to go back requires those mentors and leaders being able to re-flex those muscles - but if those muscles have already atrophied or have never been built in the first place, it becomes incredibly difficult and painful to build them back up.
On top of that, we all know the type of code reviewer who already has things they’ll immediately approve or reject just because of their preferences lol — and they really aren’t helpful. And likely, if we abandon the practice and then try and restart it… those are the types of reviews we’re going to see instead of the intentional teaching/training versions!
The next culprit is the outsourcing of mentorship to AI, and before that, Stack Overflow. It used to be that when senior developers didn’t have time to mentor newer developers or help with their issues, they’d push them to SO to go try and find answers for themselves, and now that has evolved into AI. And I get why they do it! Leaders, mentors, etc. are stretched thin and need a way to help their junior developers without spending all of their time developing those skills, since they also have code development responsibilities!
And let’s be clear, some of the time it IS necessary to go out to an external source, and Stack Overflow and AI can be good learning tools! But when you’re under the gun to get something delivered and just need it to “work”… you know how developers will be lol. They’ll just copy pasta and pray something works, and if it does, awesome, and if not, they’ll try again.
The incentive structure for them is to just get it done, not figure out how to get it done. And that’s reasserted over and over again, and treated like the “right” thing to do but goes back to our other conversation about when the helping hurts.
But we didn’t really talk about how this also is a specific harm to the leaders themselves. And we’re framing it as benefits, when I think it’s actually missing the mark. And I’m not immune!
I’ve had numerous conversations about AI shifting the workload of leaders away from training and spending time with juniors and toward solving the “harder” problems because… it sounds great logically! That time IS valuable for leaders, and they ARE more equipped to solve the hairy problems that no one else can solve.
But it loses so much of the mentorship that they’re ALSO benefiting from! When they’re forced to mentor people, it forces them to re-evaluate why they’re doing things the way they’re doing them, and identify where they can simplify things to make it easier and easier to explain and do the things that they need to do on a day to day basis.
Which means that companies become more and more reliant on the senior developers because… they’re the only ones who actually UNDERSTAND how everything works! The juniors, instead of learning to understand, are now just code jockeys putting things in and crossing their fingers, which creates more work for the seniors and becomes a never ending cycle.
And finally, this all comes down to the degradation of education in the United States specifically. And I’m going to sound like an old man yelling at a cloud here (don’t worry, I have a whole “old man yells at cloud” section coming up to be a little more self aware lol) — but we just… don’t respect education and teachers like we used to. And maybe, again, we never really did.
But this is even more impactful because the only teachers we DO respect are the ones teaching hard skills and not soft skills. We’ll promote the hell out of STEM and then ignore English and History.
I don’t think it started as malicious, I think it was more focused on the idea that those hard skills have specific utility that we wanted to focus on in society. And then it was, we want to ensure that we don’t leave women and girls out and so we should have more STEM programs for them! And then it became “what ISN’T an engineering problem?!” And then we got the oligarchs in Silicon Valley coming up with new and novel ways of keeping our attention while also rotting our brains and it’s just a cycle that’s led to technocrats being the ideal and anyone who shows any soft skills being seen as, well… soft.
But the problem is, this misses the reason for a well balanced education. If you become myopic in focus, you don’t just become an expert in that field, but you start losing sight of things that aren’t in your field. And you start losing the connective tissue to ideas that are outside of your field but could revolutionize it!
As an example, I love a card game called “Set”. The entire intent is to train your brain to recognize illogical connections. You shuffle a deck and lay out 12 cards. Each card has four attributes - shape, number of shapes, color, and filling - and each attribute has three possible values. Then you find “Sets” of 3 cards, where each attribute needs to be either all the same or all different across the three cards. It’s hard to explain but go play it here, you’ll have a good time: https://smart-games.org/en/set_classic/start/
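If it helps to see the rule as code, here’s a tiny Python sketch of the “all same or all different” check (the card encoding below is just my own illustration, not anything official from the game):

```python
from itertools import combinations

# A card is a tuple of four attributes, each taking one of three values.
# The attribute names and values here are illustrative placeholders.
def is_set(a, b, c):
    """Three cards form a Set iff every attribute is all-same or all-different."""
    # A set of {x, y, z} has size 1 (all same) or 3 (all different);
    # size 2 means exactly two match, which breaks the rule.
    return all(len({x, y, z}) in (1, 3) for x, y, z in zip(a, b, c))

def find_sets(cards):
    """Return every 3-card combination from the layout that forms a Set."""
    return [trio for trio in combinations(cards, 3) if is_set(*trio)]

cards = [
    ("oval", 1, "red", "solid"),
    ("oval", 2, "red", "striped"),
    ("oval", 3, "red", "empty"),
    ("diamond", 1, "green", "solid"),
]
# The first three cards are the only valid Set in this layout:
# shape and color are all the same, number and filling are all different.
print(find_sets(cards))
```

The neat part is that the whole game reduces to that one-line `is_set` check, which is exactly the kind of pattern your brain is being trained to spot at a glance.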
In any case, the whole point of the game is to teach your brain these illogical connections, which allows your mind to start connecting seemingly unconnected things in novel ways. But there’s not a real “utility” to the game or the exercise. I can’t show you a specific outcome. You don’t “win” the game, you just play it.
That kind of learning isn’t valued in our society because, how is it going to make you money? How is it going to put food on your table? Get you a better house? Make you healthier? Or more efficient?
We’ve stripped away the things that make us human in exchange for technology in a way that isn’t at all helpful to us as humans! And learning is the POINT of being a human. What other animal takes a year to figure out how to do something as basic as walking? What other animal takes decades to have a fully mature brain? (I’m sure there are ghosts screaming at me saying “BUT WHAT ABOUT XYZ” and I’m sure you’re right ghosts, but that’s not the point, the point is that we are by our very nature, learners.)
Let’s be real here though, some of this is probably just an old man yelling at a cloud. And I don’t want to be naive and think that “I know all the problems and solutions” because I for sure don’t, I’m using these Tuesday posts to literally just say the things that I’m thinking about and hope they resonate in some way so that we can try and stem the tide of stupidity that we seem all too eager to follow.
So recognizing that… some things may just be better this way. Specifically, this could all be a positive step toward doing away with unnecessary processes and guard rails. If people don’t NEED a mentor, if they’re self starters who are using the tools more effectively than I’ve been positing, using them to truly learn and understand why solutions work the way they do, that’s an awesome development. Similarly, if they’re able to take those learnings and find new and novel ways to explain their thought process to their leaders, those leaders are also learning, and that alleviates the concerns I posed!
I might also be discounting just how disruptive junior engineer training is; maybe by reducing that noise we’re not hurting teaching as much as I think, and I’m just not a fan of the motions we’re going through. Or maybe I’m just in a pessimistic mood and don’t want to admit that AI, or SO, or education are all actually in good spots, and I’m just not a fan of the latest methodologies that will make our world better.
But I think that even if I’m wrong on some of these things, and even if it’s just my inherent bias, it doesn’t change the premise that learning is a two way street, and stripping the role of teacher from lead engineers creates a knowledge gap. Even if that gap isn’t with the learners, it’s for sure there for the teachers.
Einstein supposedly said that if you can’t explain it simply, you don’t understand it well enough, and I think in the end, that’s where my brain is heading. Granted, I used a couple thousand words when Einstein probably would have said a hundred would do. But the point is, if you never have to explain yourself, you won’t get to a deeper understanding, because you’ll never need to find the simplest way to explain it while still being accurate.
And that exercise, painful and fraught with failure as it is, is critical to being a leader and a mentor. Losing it isn’t just “progress” or “a new technology” — it’s a fundamental shift in the way we communicate and learn and it’s not going to end positively unless we find ways to inject teaching and learning into our processes rather than algorithms and chat bots that can just tell us we’re brilliant.
With that… Claude… how well did we stick to our outline…
Critical Assessment: Your actual piece is more honest and nuanced than our outline - the uncertainty and self-questioning makes it more compelling than a confident diagnosis. The Set game example and the bidirectional learning insight are genuinely fresh angles we missed.
The education section feels a bit tangential, though it does set up the broader context for why teaching skills are devalued.
Overall, your instincts served you better than our outline - the humility and exploration make it more engaging than the more systematic approach we mapped out.
Look at that, it thinks I did well. Shocking right?