Local testing of imports
James Westby
james.westby at canonical.com
Sat Dec 12 02:24:36 GMT 2009
On Fri Dec 11 18:35:08 +0000 2009 Diogo Matsubara wrote:
> All oops generated by any of the LP application servers are processed
> daily and summarized on what's called an OOPS summary. You can see an
> example of one of those summaries at:
> https://devpad.canonical.com/~lpqateam/oops-summaries/lpnet-2009-11-30.html
>
> Look at the All exceptions section:
>
> 55 AttributeError: 'NoneType' object has no attribute 'email'
> Bug: https://launchpad.net/bugs/452525
> POST: 45 GET: 10 Robots: 0 Local: 34
> 19 https://launchpad.net/branchfilesystem/translatePath
> (BranchFileSystemApplication:BranchFileSystem)
> OOPS-1430XMLP13, OOPS-1430XMLP14, OOPS-1430XMLP15,
> OOPS-1430XMLP16, OOPS-1430XMLP17
>
> oops-tools then aggregates all OOPSes that have the same exception
> type (e.g. AttributeError) and exception value (e.g. 'NoneType' object
> has no attribute 'email'), displaying a sample of those OOPSes along
> with the URL where that was triggered (and other information as you
> can see above).
>
> Once a specific OOPS instance (e.g. OOPS-1430XMLP13) is analysed by
> one of the LP engineers, a bug is filed and then linked to the OOPS
> report (manually, usually by Ursula or me). Next time an OOPS
> summary is generated, it'll know about the bug report and link it
> alongside the aggregated OOPSes of the same type.
>
> OOPSes of the same type are identified by a signature, which is
> computed from the exception type and exception value.
>
> Does the above make things clearer about oops-tools? Would it be able
> to address your use case?
It sounds like it does.
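To check my understanding, the grouping by signature you describe could be sketched roughly like this (the field names here are hypothetical; the real oops-tools data model is certainly richer):

```python
from collections import defaultdict

def aggregate_oopses(oopses):
    """Group OOPS reports by a signature of exception type and value.

    Each report is assumed, for illustration only, to be a dict with
    'id', 'exc_type' and 'exc_value' keys.
    """
    groups = defaultdict(list)
    for oops in oopses:
        # The signature is the (exception type, exception value) pair,
        # so e.g. all AttributeError: 'NoneType' object has no
        # attribute 'email' reports land in one bucket.
        signature = (oops["exc_type"], oops["exc_value"])
        groups[signature].append(oops["id"])
    return dict(groups)
```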
I have a bunch of exceptions as files on disk. I want to try to group
those caused by the same issue, so that we know which problems are
causing the most failures and the like.
The code I took from apport reads these files and generates a signature
from each crash that is basically exception class + function stack where
it was raised. This seems to work quite well for our needs. The step I
was missing when I sent the original mail was how to go from a bunch
of text file tracebacks to signatures that could be compared to distinguish
them. I think I have that now, but if you have suggestions for improvements
I would be glad to hear them.
> Can you send me the annotated version too? That way I'll be able to
> understand better your use case.
You can see it at
https://lists.ubuntu.com/archives/ubuntu-distributed-devel/2009-December/000183.html
Thanks,
James
More information about the ubuntu-distributed-devel mailing list