<div dir="ltr"><p dir="ltr">+1</p><p>This looks really great and well thought out. I appreciate the effort that went in to outlining this. I think this will be extremely helpful.</p><p>Thanks,</p><p>Wayne<br></p>
<div class="gmail_quote">On Jul 17, 2014 10:28 PM, "Tim Penhey" <<a href="mailto:tim.penhey@canonical.com" target="_blank">tim.penhey@canonical.com</a>> wrote:<br type="attribution"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
Hi all,

Last night in the team meeting we discussed the review process. This
came up initially because a number of things have been sneaking through
reviews that we would ideally like to catch earlier.

In order to help the newer folks on the team learn how we review, it has
been suggested that we formalise the mentored reviewers process.

Mentored Reviewers
==================

All juju core members are expected to review code. It is not reasonable
to expect the newer members of the team to feel comfortable being the
sole reviewer on a piece of work.

I have created a new sheet for people to register themselves on if they
feel they would like to have a review mentor [1]. You should discuss
this with your team lead if you are unsure. A mentor will be allocated
to you, or if you have discussed this with someone in particular, you
can have a say in who your mentor is.

Reviews done by a mentored developer are not considered sufficient to
land code until the mentor has also approved them.

It is up to the mentor and mentee how they choose to do the reviews.
They could do this over a hangout together, or the mentor could review
the review after the mentee has done the first pass.

Once the mentor feels that the mentee is consistently doing good
reviews, they will announce the graduation, and that developer no
longer needs a second pass on their reviews.

Guidelines
==========

Prior to starting any significant piece of work or refactoring, the
general approach should first be checked with the team lead and
technical architect. Smaller, more obvious pieces of work don't
necessarily need this, but it is often good to talk to someone about the
problem and the proposed fix as an initial sanity check.

The aim here is to avoid getting into design discussions at review time,
as that is clearly the wrong time to have them.

As the developer, it is strongly suggested that you work to the boy
scout rule - "leave the code cleaner than you found it". We don't
expect perfection (or we shouldn't), but don't leave a mess. It is ok
to push back on a reviewer if you feel they are being unreasonable in
their request for perfection. If you feel that this is the case, talk
to your team lead.

As a reviewer, if you don't understand the code or what is being done,
there are a number of options:
* it could be that the code is confusing; getting a second opinion can
help here
* it could be that you don't understand the area of code being changed,
and it isn't always feasible to grok everything completely. If the
change isn't obviously correct (to you), get a second opinion

A review that says

  "The code looks good, test coverage seems to be complete, but I can't
  tell if this will actually fix problem X"

is still a useful review, and worth doing. The developer should seek out
someone who understands that area, or work with the reviewer to help
them understand.

In any situation where you need to stop and explain, it can seem like
the process is too slow, but sharing and explaining is a short-term
delay for long-term gain, as the knowledge is shared across the team.

Checklist
=========

I think it makes sense to have a checklist to cover the basics of what
we look for in reviews, and probably even to get it into the docs
directory in trunk.

There is a document already which covers some basics of writing good
tests: doc/how-to-write-tests.txt

Initial things that I look for (off the top of my head):
* firstly, work out what the code is trying to fix (feature or bug)
* is it obvious that the code does this?
* for new functions, do the names make sense? do they say what the
function does?
* are all the public constants, variables, functions and methods
documented?
* how are the new functions tested?
* can they be tested in isolation?
* are all the new code paths tested?
* do the names of the tests make sense?
* could similar tests be combined into a table-based test? (see the
first sketch below)
* should a table-based test be broken out into more obvious tests?
(this sometimes happens when there is too much logic or too many
conditionals in the actual test block of the table-based test)
* is this code valid for all operating systems?
* are there any race conditions?
* are the tests isolated from the environment? (see the second sketch
below)

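To illustrate the table-based test point, here is a minimal sketch using
only the standard library testing package. Normalize is a made-up
stand-in for whatever function is under review; in juju core the real
thing would more likely be written against our gocheck suites.

    package example

    import (
        "strings"
        "testing"
    )

    // Normalize is a made-up function standing in for whatever is
    // under review: it trims whitespace and lower-cases a name.
    func Normalize(s string) string {
        return strings.ToLower(strings.TrimSpace(s))
    }

    func TestNormalize(t *testing.T) {
        // Each row is one case; adding coverage means adding a row
        // rather than another near-identical test function.
        tests := []struct {
            about    string
            input    string
            expected string
        }{
            {"already clean", "machine-0", "machine-0"},
            {"surrounding whitespace", "  machine-0  ", "machine-0"},
            {"mixed case", "Machine-0", "machine-0"},
        }
        for i, test := range tests {
            t.Logf("test %d: %s", i, test.about)
            if got := Normalize(test.input); got != test.expected {
                t.Errorf("Normalize(%q) = %q, want %q",
                    test.input, got, test.expected)
            }
        }
    }

The moment a row needs its own special-case logic inside the loop is
usually the moment to break it back out into a separate test.
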
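And for the environment isolation point, a sketch of pinning an
environment variable for the duration of a test. proxyFromEnv is again
a made-up example, and our testing packages may already have helpers
that do this for you.

    package example

    import (
        "os"
        "testing"
    )

    // proxyFromEnv is a made-up function standing in for code under
    // review that reads its configuration from the process environment.
    func proxyFromEnv() string {
        return os.Getenv("HTTP_PROXY")
    }

    func TestProxyFromEnv(t *testing.T) {
        // Pin the variable to a known value and restore the original
        // afterwards, so the result never depends on the machine the
        // test happens to run on.
        orig := os.Getenv("HTTP_PROXY")
        defer os.Setenv("HTTP_PROXY", orig)

        os.Setenv("HTTP_PROXY", "http://proxy.example:3128")
        if got := proxyFromEnv(); got != "http://proxy.example:3128" {
            t.Errorf("proxyFromEnv() = %q, want %q",
                got, "http://proxy.example:3128")
        }
    }
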
Cheers,
Tim

[1] https://docs.google.com/a/canonical.com/spreadsheets/d/1v9KB6Y4r1bMLOyB1JEs-wj_jBAvsq6EglWssa4cOx9c/edit#gid=0