Note: if you're just looking for performance review templates and examples, head to the templates page to download them.
I've had about a dozen performance reviews during my decade-long software engineering career. Some of them were unmemorable, some okay, but a good chunk of them were just... plain bad. Often, these bad ones marked my "shields down" moment: the point at which I lost trust in my manager and decided that I needed to change teams or companies. When I became a manager, I silently vowed to do whatever I could to avoid those awkward, bad performance reviews.
This post summarizes my approach to preparing for and delivering performance reviews that are fair, build trust and motivate people. I also share some examples at the end of the post. I've delivered more than 50 reviews with this approach over 3 years, most of them well-received. I decided to write this post after presenting some of the concepts internally and at a local meetup, with a few requests to share it more broadly.
The post also comes with performance review templates and examples:
- Template and example for software engineers to write your self-review
- Template and example for engineering managers to write the reviews for your directs
As with any topic that's around management: take it as inspiration and a source of new ideas, not a recipe to follow. And take it as a way to challenge yourself: where can you improve on how you do performance reviews?
Note: this article is primarily aimed at managers delivering reviews. I'll be writing a follow-up post for tips to prepare better for your performance review, as an employee. Sign up to the newsletter to get notified of this.
What this article covers:
- Prerequisites for fair perf reviews: levels, competencies, expectations
- Bad performance reviews
- Three observations about perf reviews at tech companies
- How I do performance reviews: preparation, writing the review, removing bias & delivering it
- Example performance reviews - and templates
- What do the "pros" say about good performance reviews?
- Principles for fair performance reviews
Prerequisites for fair perf reviews: levels, competencies, expectations
For a performance review to be fair, levels and expectations need to be well-defined. If they are not, you can still get good feedback - but it will be subjective feedback. All good tech companies have clear levels and expectations/competencies explicitly defined. These are usually kept as internal resources, visible to all engineers/managers within the company. This was the case at most companies I worked at. Luckily, some companies are making their definitions public. A few examples:
- Monzo's progression framework for a variety of disciplines, not just software engineering
- Rent the Runway's career ladder, shared by Camille Fournier, author of The Manager's Path
- Square's software engineering career ladder. See their blog post to put this framework in context.
- Buffer's engineering career framework
- For an idea of levels within companies, see my article Engineering Career Paths at Big Tech and High-Growth Startups
- Take a look at Progression.fyi for more examples.
If your company doesn't have levels and expectations clearly defined for developers, this is a blocker to fair performance reviews. If you're a manager, make it your job to fix this and put one together, pulling in experienced managers and engineers to build the first version.
Bad performance reviews
A good chunk of my approach to perf reviews is driven by the bad reviews I've had. I made a list of the characteristics that made them bad:
- Zero specifics. The type of review where there are some fluffy sentences and zero substance. On the surface, the review might look good, with feedback like "Good work" or "Keep doing what you're doing". But when you ask for specifics, your manager cannot give a single example, or constructive feedback. While this review might read as a supportive message from your manager, think again. You've gotten no specific feedback on where you could grow, and your manager did not seem to spend as much as five minutes compiling your review. How much does this manager care about you?
- My manager is disconnected from what I've been doing. You've had a great period, shipping several impactful things. Yet, on the review, none of this is mentioned, only small, sideline things. How is it that they did not call out the big items? And since they never mentioned them, do they even know what you've worked on? In my experience: they likely don't. And since they don't, don't expect they will vouch for you when it comes to the promotion to the next level.
- The "let's skip straight to the numbers" review. In the meeting, you're excited: finally, a manager who doesn't keep you waiting, and gives you what you're waiting for: the bonus and pay numbers, then asks if you have any questions. You are still thinking about how to spend that money - so you probably won't have questions, especially if the numbers look good. After the meeting, you re-evaluate whether this is the review you really wanted. It was called a performance review, except you did not talk about one thing: your performance. And the next day you might feel too awkward to bring this up. This type of review is no better than the "zero specifics" review.
- The unfair, one-sided review. The kind of review after which you walk out with your blood boiling. During the review, you get some negative feedback, which your manager tries to deliver in a civilized way. So they accompany it with examples. Except those examples are one-sided, miss key details and are misinterpreted. You might challenge this - only for your manager to get defensive. And, to be fair to them, they kind of have to, right? The rating determined your bonus, which can no longer be changed. So after the meeting, both of you are frustrated. You are upset at how you've been treated unfairly, and then ignored when you pointed out missing facts. In the worst case, your manager is just as upset, but at you, thinking your concerns are baseless and that you were being unnecessarily combative. What are the chances that you'll mend this broken relationship?
- The biased review. I'm not talking about the positively biased ones, but the negatively biased ones. There could be recency bias, focusing on a recent hiccup but ignoring lots of great work from before. There could be gender bias, especially if you're female, with more focus on your behaviour than your accomplishments, or downplaying of the latter. Or there could be several other biases that can make you feel that your contributions are devalued.
- The "took me by a huge surprise" review. I left the most important one for last. If a review takes you completely by surprise - usually not in a good way - that's a failure by your manager to deliver feedback on a continuous basis. Good reviews will bring lots of new information, but very few surprises. Period.
Three observations about perf reviews at tech companies
After having received over a dozen perf reviews, talking with many others, and now being on the other side of the table, having reviewed or delivered close to 100 reviews, here are three observations I have about the process.
- Trust between employee and manager is the most likely to break as a result of a poor perf review. Bad perf reviews break trust, and are the first step to the employee looking for other opportunities. This was the case with myself and with several people I talked with.
- Too few engineering managers put meaningful time into writing perf reviews that have feedback of substance. Many managers I've had and ones I know cut corners on this activity, making up excuses for why they don't have time for it. Lack of time is also often used as a poor excuse for a lack of thoughtfulness.
- Acknowledgment of achievements is just as important as a numeric rating and the numbers. My biggest disappointments with perf reviews were never about the numbers - even when they were below what I expected. They were because of the lack of acknowledgment of my work, and the lack of reflection on my achievements and growth. While managers don't always have control over the numbers, they always have full control over how they say - or don't say - thank you.
How I do performance reviews
As I was writing my first-ever performance review, I wanted to avoid the bad performance reviews I've seen, while keeping in mind the impact these reviews have on people. But it wasn't just anti-patterns I wanted to get around.
I set goals for the outcome of the performance review:
- Fair, unbiased and clear feedback. The feedback should be based on expectations previously set - hence the importance of levels and competencies. It should be calibrated, and clearly state whether the person was meeting, below or above expectations.
- Motivate. I aimed for both of us to walk out of the room motivated. How is this possible? Starting with acknowledging all the good work, and talking through what to focus on next. There is a tricky part here: what if the numbers don't support the message I'm trying to give?
- Build trust. As perf reviews are the biggest risk in trust between manager and employee, my goal was to turn this around, and strengthen the trust between the two of us. This requires, at the very least, the review to be an honest and two-way conversation.
With the goals set, let's get into how I deliver them. But before we get to the meeting, preparation is key. And when I say prepare, I mean preparing from the very start - months before, or as early as when the employee joins my team.
Preparing for the review: months before
You can't do good performance reviews without understanding the person, and setting baseline expectations early on. So as soon as someone new joins my team, I start with a few things:
- Understanding their goals and motivations. What are they hoping to get out of working here? What things do they care most about professionally: growth, titles, leadership, impact? What are their priorities outside of work? I also ask people what their dream would be to do after this job - we all know this is not their last job, or my last job, and I'd like to help them reach their goals beyond just this company and position.
- Clarifying my role as a manager, and what they should expect from me. I always make it clear that their career is in their hands: that is, don't expect someone else to drive this. However, as a manager, they should expect me - and hold me accountable - to give regular feedback, create opportunities for growth, and mentor/coach when there is an opportunity to do so.
- Regular 1:1 meetings, which are time dedicated to them, and a place to share feedback both ways and talk about important things on their mind. If there's ever an issue with performance, I aim to bring it up here, discussing and resolving it.
- Clarifying the difference between promotions and professional growth. Especially for engineers early on in their career, many assume that the only way to grow is through promotions, and some decide to optimize for this. I have many thoughts on promotions, and this kind of thinking misses many parts of professional growth, so I make sure to talk it through, early on.
Preparing for the review: weeks before
I believe in the need to prepare for important meetings. The more important they are, the more preparation is needed. Performance reviews are perceived as one of the most important meetings for engineers - at least, when it's the first one with a new manager. So I prepare accordingly.
Collecting the substance of the "what" and the "how" is where I spend most of my time. I want to make sure I get the facts right - and working with software engineers, much of people's work is readily available. Here are the things I go through:
- 1:1 notes.
- Projects the engineer worked on. What did they contribute?
- Output produced: code, documentation, emails.
- Feedback they received: peer feedback, thanks they received over email or other means, and other feedback I can find.
- Feedback they've given: code reviews, planning documentation reviews, interactions with others
- Self-review: what have they collected on their work? I try to leave this for last, to catch anything I might have missed.
This part is one I spend a lot of time on, especially if it's the first time doing a perf review for a person on my team. My goal is to uncover all the great work people have been doing - even if it's something they might have forgotten. I keep a checklist of these sources, which I update before every cycle.
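As a rough sketch, the sources above could even be tracked with a small script that flags which ones still lack evidence before the review is written. The source names and sample data below are illustrative, not my actual checklist:

```python
# Sketch of a review-prep checklist: aggregate evidence per source and
# flag sources with no entries yet. All data below is illustrative.

SOURCES = [
    "1:1 notes",
    "projects",
    "output (code, docs, emails)",
    "feedback received",
    "feedback given",
    "self-review",
]

def prep_status(evidence: dict) -> dict:
    """Return evidence counts per source, plus sources still missing evidence."""
    counts = {s: len(evidence.get(s, [])) for s in SOURCES}
    missing = [s for s, n in counts.items() if n == 0]
    return {"counts": counts, "missing": missing}

# Hypothetical evidence gathered so far for one engineer
evidence = {
    "1:1 notes": ["2023-01-10: discussed on-call load"],
    "projects": ["Checkout migration", "Latency dashboard"],
    "feedback received": ["Peer praise on incident handling"],
}

status = prep_status(evidence)
print(status["missing"])
# → ['output (code, docs, emails)', 'feedback given', 'self-review']
```

The point of a structure like this is simply to make gaps visible: an empty source is a prompt to go dig further, not a verdict on the engineer.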
Writing the review
Performance reviews have a recommended format - which I ignore, until it's time to fill it in the tool. Here's how I write my reviews, in a separate document:
Listing out "what" was done
- I start with the summary of the work done in the past period, based on my preparation. I usually list in time order, calling out the more impactful or challenging contributions.
- This list has three purposes. First, it acknowledges the work they did. Second, I make sure to pause on delivery and ask them if I left anything of impact out. If I did, we add it. Third, it usually reassures people that this is a fair review - and people are often surprised at the details this list contains. It's a reflection of how much I've prepared for this meeting.
- I re-read all competencies, internalizing the expectations for meeting them.
- Using the work done, and the "how" of how it was done, I determine whether this person met, exceeded or was below the bar for the given competency.
- I always provide examples, and specific feedback for improvement. This is especially important when someone did not meet expectations: the feedback needs to be specific, and actionable. It's just as important when people met, but did not exceed expectations. For people exceeding, I sometimes give more stretch ideas.
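The "every rating needs examples" rule above can even be checked mechanically before finalising a review. Here is a minimal sketch of such a check - the competency names, ratings and data are hypothetical:

```python
# Sketch of a per-competency assessment check: every rating must come
# with at least one concrete example before the review is finalized.
# Competency names and sample data are illustrative.

RATINGS = {"below", "meets", "exceeds"}

def validate_assessment(assessment: list) -> list:
    """Return a list of problems: invalid ratings, or ratings with no examples."""
    problems = []
    for entry in assessment:
        if entry["rating"] not in RATINGS:
            problems.append(f"{entry['competency']}: invalid rating")
        if not entry["examples"]:
            problems.append(f"{entry['competency']}: missing examples")
    return problems

# A hypothetical draft assessment with one incomplete entry
assessment = [
    {"competency": "Technical execution", "rating": "exceeds",
     "examples": ["Designed the retry mechanism for the payments service"]},
    {"competency": "Collaboration", "rating": "meets", "examples": []},
]

print(validate_assessment(assessment))
# → ['Collaboration: missing examples']
```

A flagged entry is a reminder to go back to the evidence collected during preparation and attach a specific example, rather than shipping a vague rating.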
Finalising the review
- Short summary: I add this summary to the bottom, making it clear where this person is against their level, and how close - or far - they are to promotion to the next level.
- Top 3 focus areas: I summarize three things I would focus on for faster growth, if I were them. I try to make these suggestions specific. This will be the main - hopefully motivating - takeaway of the review.
- I re-read the package, and remove duplicate examples. I also make adjustments, so the overall message supports the summary across all competencies.
Calibration and removing bias
We are all biased, and I am no exception. To reduce the bias, I do a couple of things.
- Comparing perf reviews of people side by side - with special focus on high and low performers, and people who are different from me. Did I hold the same bar? Am I using similar language and tone to call out achievements or shortcomings?
- Sharing the reviews with my manager, asking for feedback on both calibration and biases. Not only is this a good way to get feedback, it also increases my reports' visibility with my manager.
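One way to make the side-by-side language comparison concrete is to count markers of specific versus vague feedback per review draft. The word lists and review snippets below are illustrative only - this is a rough self-check, not a validated bias detector:

```python
# Rough consistency check across written reviews: compare how often each
# draft uses concrete, example-oriented language vs vague filler.
# Word lists and review snippets are illustrative.

SPECIFIC_MARKERS = {"shipped", "led", "designed", "example", "project"}
VAGUE_MARKERS = {"good", "nice", "fine", "okay", "great"}

def language_profile(review_text: str) -> dict:
    """Count specific vs vague marker words in a review draft."""
    words = review_text.lower().split()
    return {
        "specific": sum(w.strip(".,") in SPECIFIC_MARKERS for w in words),
        "vague": sum(w.strip(".,") in VAGUE_MARKERS for w in words),
    }

# Two hypothetical review drafts, compared side by side
reviews = {
    "alice": "Led the checkout migration project and shipped the dashboard.",
    "bob": "Did a good job. Nice work overall, keep it up.",
}

for name, text in reviews.items():
    print(name, language_profile(text))
```

If one draft is all vague filler while another is full of specifics, that asymmetry is exactly the kind of thing worth investigating: did I hold the same bar for both people?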
Preparing my direct for the review
As I finish the review some time before delivery is due, I make sure to prepare people - especially if I've found examples of not meeting expectations that were not communicated before. I check these examples, and give a heads-up on what kind of feedback they should expect during this session, and what growth areas we'll talk about.
Delivering the review
I always hold a separate performance feedback meeting and a separate compensation meeting. I've tried different setups, and this is the one that worked best for quality discussions and feedback. As soon as people hear their compensation numbers, their brain re-focuses and any conversation about improvement is moot. This was the case for myself, and it was the case for all my directs. So as much as people would love to know those numbers, I don't share them in the same meeting.
I start the meeting by setting expectations: what the format will be, that they'll get all of this in writing. I also make it clear that I'd like to have this be a two-way conversation and how I'll pause after each section/competency to hear their feedback.
I start by going through the achievements for the period - which is a good way to establish trust, and show that I've done my homework as well. Then, I go through the feedback on each area. I avoid the feedback sandwich, sending one message. I then pause and encourage people to share their thoughts by questions like "What do you think?", "Are there parts you disagree with?" or "How realistic did this assessment feel?". I pay attention to body language - when people tense up, they are likely disagreeing, and I try to have them voice why. Better talk about it now, than later.
Wrapping up, I ask them to reflect on the review with questions like "How do you feel?", "What was the most surprising part?", "What part do you disagree with?".
What if people disagree? I was wary of openly inviting people to voice their concerns, and sure enough, I had a few people tell me they felt a small or large part of the review was incorrect or unfair. I do keep an open mind about not necessarily having all the context. So if I have missed information, facts or feedback, the review might indeed not reflect their performance. This is another reason I start the review with their achievements - anything relevant I missed might impact the review.
When it's clear that I missed or might have misinterpreted key parts, I ask them to gather these examples in the short term and do a follow-up meeting the next day. I have revised my rating in the past, when I missed important details and the person rightfully called this out.
What if people disagree with my interpretation of meeting expectations? This is trickier, but resolving it is key to keeping trust between the two of us. This is why I have specific examples of not meeting or exceeding expectations, and actionable feedback on how to improve to get there.
Either way, I find it important to know as early as possible - during the review - if people are unhappy with their qualitative assessment. This is also why I always decouple the financial conversation from this kind of feedback. People might be happy or unhappy with the qualitative feedback, but feel differently about the financial numbers. One is easier to resolve than the other.
After the review
The performance review is not quite over - there are a few more things to close it off.
- Sending over the written review. I send the review we talked through over email. I know some people take plenty of time to read through this. I like to send my document even when we have a separate perf system, as in my document I can highlight key growth areas more clearly.
- When people don't agree with parts of the feedback, we do a follow-up meeting very soon. In this meeting, we stick to the specific areas in question. In almost all cases, I've found the disagreement to be about either missing information or the interpretation of specific examples.
- Bonus and salary numbers in a separate meeting. People might be happy with their performance review, but unhappy with the numbers - and that's a different problem. It's good to separate the two. I've even had a person who was unhappy with their perf review - and we spent time untangling this - but was ecstatic about the raise they were given by the central team. Had we done the two meetings at the same time, I might not have learned that he was unhappy with my feedback.
- Forward-looking goal setting. Especially important for anyone who was below expectations in any of the areas. It's useful to also start early for those exceeding expectations, who could start to prepare for promotion to the next level.
Example performance reviews - and templates
I've had several people ask for examples of performance reviews. Performance reviews will be different for each company, and can be specific to each manager. Still, as inspiration, I thought I'd share what I have seen, and what I use.
For self-reviews for engineers to use as performance reviews, you can find these here. Here is what this example appraisal or evaluation looks like:
For performance reviews that engineering managers deliver, see a template for performance reviews that I use - I'm sharing it here as inspiration. If you're someone who does performance reviews, I encourage you to build your own. Putting one together can make focus points clear, ensure consistent delivery between people, and also reduce bias. Writing the feedback ahead of time, rather than making it up as you go, is a powerful way to deliver more objective and less biased feedback.
What do the "pros" say about good perf reviews?
Needless to say, the above is my approach: your mileage might vary, and what worked for me might not work for you. I was curious though: what are the leading thoughts on good performance reviews in the management industry? So I turned to an authoritative source, the Harvard Business Review, and reviewed some of their most-read pieces on perf reviews. Here's what they have to say:
In Delivering an Effective Performance Review (HBR, 2011), Rebecca Knight suggests the following:
- Before: prepare. Set expectations early. Lay the groundwork: prepare a few weeks ahead of the review. Set a tone early on and don't use the feedback sandwich.
- During: coach constructively. Ask them about how they are feeling. Talk about behaviours, not positions/ranks.
- Hold your ground. Separate perf review and compensation discussions. Hitting goals should be meeting expectations: don't inflate this by giving feedback that it's above expectations.
In the article Why Most Performance Evaluations are Biased, and How to Fix Them (HBR, 2019), Lori Mackenzie, JoAnne Wehner and Shelley J. Correll bring some specific suggestions for less biased reviews:
- Open box feedback (one with no structure) is a biased approach both for men, and for women.
- Constrain the form of feedback to remove bias: competencies help a lot. Evaluate against competencies or a rubric, and ask more specific questions (e.g. "describe the way the employee met expectations").
- Run a consistency check across all your reviews, after you wrote them down, to remove bias.
- Unbiased feedback leads to better performance.
The key takeaway from their article is this:
What we learned is that ambiguity in assessments can lead to bias. They take this insight and find ways to use rubrics and prompts to be consistent and fair. It might be tempting to think we can just trust our instincts. But the challenge is that implicit bias creeps in, and it's really difficult to see and therefore stop it in its tracks.
Principles for fair performance reviews
Based on my personal experience, approach, and success, and cross-checking it with recommendations from experts publishing in HBR, here are a few principles worth considering for writing great performance reviews:
- It takes a lot more time to do it right. Find this time. Not enough managers put in the time, even though it's such a big win, when you do. Find the time for a proper, in-depth review, especially in the phase of building up trust with your direct.
- Engineering work is specific - use it for specific feedback. Engineers spend a lot of time writing code, documents, emails. We also communicate and work together. What this means is there is a lot of specifics available for the work of an engineer. Use these specifics when giving feedback. Instead of giving feedback on the person writing bad code, be specific. If collaboration could be better, use a specific project example. When praising, do the same.
- The more frequent the feedback, the better. Perf reviews should be a "no surprises" conversation. This article focused on preparing for the "major" perf review - the one that happens every 6-12 months. But to have a productive conversation at this event, you need to gather and give feedback continuously. Continuous feedback and well-delivered formal performance reviews go hand in hand in helping people grow.
- People really appreciate getting quality feedback. Your directs will see if you put time and energy in a thorough performance review, and they will be very thankful for specific and actionable feedback. Most developers have yet to receive quality performance reviews. By doing so, you build trust, help them grow, and you also grow in becoming better at giving great feedback.
- Good perf reviews lead to more trust, faster growth and less attrition. I have seen good performance reviews pay off in the low attrition across my team, high trust and people growing and being promoted faster. I've gotten similar feedback when sharing my approach with HR professionals, many of whom were surprised by the larger-than-usual effort I put into this process. And the effort gets smaller, over time. Once you master some of these techniques, gathering feedback becomes easier.
If you're a manager, challenge yourself to improve at least one aspect of how you do performance reviews, taking inspiration from this article. There is very little excuse for managers to deliver poorly prepared, unfair or biased performance reviews - the upside is saving some time, but the downsides are far more impactful. Let's change how we approach reviews and turn these stressful conversations into ones where trust is further strengthened, and people come out feeling motivated and determined to grow further.
Related to this topic, see the article How to spot and counter manager biases on performance reviews and watch my videos on performance reviews on YouTube.