[doc] [draft] The long-term documentation plan
Enrico Zini
enrico at enricozini.org
Fri Oct 15 16:53:20 CDT 2004
Hello,
And finally, the long-term documentation plan I've been working on,
going out for more peer review. We can roast it here a bit, then if we
like it we put it in the wiki next week.
* Proposal (version 0.3)
An important requirement for writing documentation is knowing as much as
possible about the users and their goals, to be able to give them
appropriate guidance on what is really relevant to them[1].
There are two common ways of explaining things in the free software
world: writing documentation and answering questions on a mailing list.
While both have the same goal of helping people use an artifact
efficiently and with satisfaction, the former is very frustrating, the
latter much less so. Moreover, frustration caused by varying approaches
to documentation can turn users off completely, thus achieving the
exact opposite of the educational goal.
One of the differences between these two similar activities is that
when we write documentation we have to guess who the user is, and what
(s)he might be trying to do; in a mailing list, on the contrary, this
information about the user is evidently the core of any well written
help request. As a result, I propose to exploit the pattern of the help
request to concentrate documentation efforts on writing relevant and
targeted documentation.
The core of the proposal is trying to establish, maintain and evolve in
the user community practices of collecting and structuring the
information that is usually discussed and shared in the community
mailing lists.
Requests for help may come explicitly (for example actively asked by
users in a mailing list, or asked with the existing bugzilla
infrastructure by means of a "help" or "documentation" (pseudo)package),
or implicitly (for example, hidden in some list discussion or coming as
a PEBKAC bug report that can be reassigned to the documentation team, or
reported by the developers as a recurring pattern of something going
wrong, or by users expressing interest in knowing about something).
Once a "documentation issue" is opened, the whole community is welcome
to participate in its "resolution". Simple queries could be solved by a
Google search and writing a summary of the findings, while other queries
could turn out to be complex and require longer discussion, or even be
"promoted" (or demoted) into real bugs. When a documentation "bug" is
claimed closed, a testers team could act as the tier that makes sure
the resolution actually achieves its goal, and may reopen the bug for
further work or discussion.
Usual community patterns could be used to foster user participation in
the documentation process. For example, a bi-weekly newsletter could be
created, listing what's new with the documentation, what are the open
issues, what best practices have been contributed and by whom. Symbolic
awards, like a "Genius of the week" mention, could be used to encourage
users to submit tricks and practices that they invented or adopted and
they feel would be worth sharing. Periodic surveys could also be
conducted to identify the most urgent matters and help decide which
issues need documentation first and foremost.
The existing Plone CMS could be used to disseminate, collect and give
structure to the pieces of documentation that are produced, creating a
hyperlinked body of knowledge. A link to the new Plone documentation
item could be posted when closing bugzilla requests for help, or in
reply to a mailing list help request, depending on where the request was
originally posted. A big added value of this would be to enormously
increase the quality and usefulness of the user list archives.
The concept of a documentation item should be constructed and negotiated
with the community. I propose, however, some qualities as a starting
point:
- Short and terse: when I am looking for help, it means that I have
some work to do, and I should lose as little time as possible on
other things;
- Complete: without compromising the previous point, measures need to
be taken to make sure a solution is inclusive and complete, so as to
avoid turning documentation users off early;
- Relevant: the documentation should answer the question or the need
that led to its creation, and not digress on other things;
- Contextualised: it is important to know if the proposed solution
applies to my case, and not to a problem which is similar, but not
related to me;
- Easy to find: there should be a clear path leading to the
documentation, with ideally no backtrack;
- Networked with further information: if I want to know more, I should
have a chance to.
Having some context would allow us to write documentation that is brief
and right to the point. Possibly, it should contain links to external
sources for readers to learn more or get more detailed explanations. A
PURL[2] system could be set up to generate persistent external links that
everyone could use, where persistence of links is a concern. Also, in
case of a reorganization of the body of knowledge in Plone, special care
should be taken to maintain the validity of old links.
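The mechanics of such a PURL system are simple enough to sketch. The
mapping and URLs below are invented for illustration, not taken from
purl.org: the point is that published links point at a stable name, and
only the mapping table changes when documents move in Plone.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Persistent names on the left; current document locations on the right.
# Both sides are hypothetical examples. When documents are reorganized,
# only this table is edited -- links published in list replies or closed
# bugzilla requests keep working.
PURL_MAP = {
    "/doc/install-grub": "https://wiki.example.org/boot/grub-howto",
    "/doc/sound-setup": "https://wiki.example.org/hardware/alsa",
}

def resolve(path):
    """Return the current location for a persistent path, or None."""
    return PURL_MAP.get(path)

class PurlHandler(BaseHTTPRequestHandler):
    """Redirect each persistent path to its current location."""
    def do_GET(self):
        target = resolve(self.path)
        if target:
            self.send_response(302)  # redirect to the current location
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_error(404, "Unknown persistent URL")

# To run the service:
#   HTTPServer(("", 8080), PurlHandler).serve_forever()
```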
Maintaining links would be important, among other things, to stay
networked with the huge existing body of documentation, to avoid the
impossible effort of rewriting what already exists and to value the work
that has already been done; plus, it would allow the created knowledge
base to act as an interconnection point for existing resources.
Using the PURL[2] system as outlined, an object-oriented approach could
be applied to link maintenance and document cataloguing: small
draft-quality documents, howtos and snippets would be treated as
"objects", building blocks for larger, more comprehensive sources of
information such as guides and books. This could also immensely speed up
the community documentation process, from Milestone 1 through to
completion.
Published pieces of documentation should have a small web forum for
users to post feedback, insight and corrections. This feature comes for
free in Plone, and this approach seems to work quite well, for example,
in existing knowledge bases like http://www.php.net/manual/en/ and
http://dev.mysql.com/doc/mysql/en/index.html
In order for the documentation to have a chance to really satisfy the
request for help, there should be some special care in understanding the
questions. I identify two types of questions, to begin with:
- Goal-oriented: the user has a goal, but doesn't know how to reach it.
In this case, it would be helpful to get some information about the
user and his/her environment, in order to suggest an appropriate
strategy. A reply to this type of question should be in the form of
a short tutorial.
- Incident-oriented: some problem happened, and the user doesn't know
how to solve it.
In this case, a simple investigation technique like Flanagan's
Critical Incident Technique[3] will prove helpful for gaining insight
into the problem. A reply to this type of question should be either
problem. A reply to this type of question should be either
troubleshooting documentation or the opening of a proper bug report.
Other kinds of questions could be identified over time, and appropriate
strategies could be developed to cope with them.
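The two question types could be recorded explicitly when an enquiry is
triaged, so the reply format follows from the classification. A minimal
sketch, with invented field names and a hard-coded mapping from question
type to reply format; nothing here reflects an existing tool:

```python
from dataclasses import dataclass, field

# Hypothetical mapping from question type to the reply format the
# proposal suggests for it. New kinds of questions would get new
# entries as coping strategies are developed.
REPLY_FORMAT = {
    "goal": "short tutorial",
    "incident": "troubleshooting item or bug report",
}

@dataclass
class Enquiry:
    """One triaged request for help; all field names are invented."""
    summary: str
    kind: str                                   # "goal" or "incident"
    context: dict = field(default_factory=dict)  # user environment info

    def suggested_reply(self):
        """Return the reply format for this kind of question."""
        return REPLY_FORMAT.get(self.kind, "needs a new strategy")
```

The `context` field is where the user-and-environment information
gathered during the exchange would accumulate, which is exactly the
insight the rest of the proposal wants to collect.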
A very important side effect of this relationship between the
documentation teams and the users is that while collecting information
on the various user questions, the documentation community inevitably
collects information on what the users' goals are and what their most
common problems are.
The questions asked, as well as the statistics on the most visited
documentation items, can be an important source of feedback, an insight
on the most common uses of the distribution and important hints for
improving its quality. This could turn out to be an important source of
insight on the users of the distribution, insight which is often quite
rare to find in most general-purpose free software projects.
Moreover, such insight and infrastructure might turn out to be a useful
foundation upon which Ubuntu's QA team could be realized.
Ideally, this insight into the community, created by means of the
documentation work, could enable the community itself to produce a
monthly or bi-monthly "letter to the developers, from the users",
collecting and synthesizing the open issues, the areas of excellence
and the dreams about the future.
[1] Hackos and Redish, "User and Task Analysis for Interface Design",
Wiley & Sons, 1998
[2] www.purl.org
[3] http://www.air.org/overview/cit.htm; also see Fitzpatrick, R. and
Boulton, M. (1994) "Qualitative methods for assessing health care" in
Quality in Health Care 3: 107-113
* The team
The documentation team should be open to community participation, to the
point of being composed mainly of the community itself.
I envision a documentation-oriented community as being made of a
majority of advanced users rather than developers. Considering how many
times we have heard the idea that people could contribute to Free
Software by exchanging tricks and practices instead of code patches, I
see the potential to finally realise it.
Ideally, the Canonical staff could consist of one, or few, facilitators
whose main role is not writing documentation, but rather: fostering
participation; mapping and networking existing efforts; collecting and
organizing the contributed knowledge; keeping the community together;
helping the community to evolve. The facilitators would also be
responsible for statistical research and reasoning, for quality
assurance, and for the maintenance of the infrastructure supporting
community documentation (e.g. PURL, Plone, etc.).
Action should take place in normal users mailing lists, with the idea of
facilitation work analogous to what is sometimes done in some developers
mailing lists. Patterns can be applied like the periodic newsletter,
FAQ documents for newsgroups, participation statistics and symbolic
(honorary) prizes. Some effort could also be made to educate the
community itself in basic problem solving, by publishing reporting
and investigation guidelines, Google search tricks, or even suggestions
on how to cope constructively with frustration and how to clearly
identify one's own goals.
Some periodic sanity self-checks should be performed by and on the
facilitator(s) in order to re-assess the relevance of the ongoing
initiatives and strategies against an evolving community, and possibly
re-target and re-plan the interventions as the environment changes.
There should also be no special expectation with regards to community
involvement.
* Actors involved, and their tasks
- Users
- browsing and searching the documentation online
- asking questions
- interacting with the documentation team and among themselves
collecting the insight needed for the documentation to be written
- contributing creativity to the discussion
- writing short tutorials and troubleshooting guides, possibly based
on messages they wrote to the list
- contributing to the body of knowledge
- proposing, commenting, improving solutions
- Testers, Reviewers, Documentation Sounders
- Checking that the documentation is correct in its language
- Checking that the documentation is correct in its procedures
- Take responsibility for some part of the documentation and do
the same as users, but in an agreed format, and in a timely
manner.
- Documentation team
- collecting the questions
- identifying coping strategies
- interacting with the user to gather insight
- finding existing related resources on the net
- watching user comments on the knowledge base, moderating them and
integrating them in the documentation
- reorganizing the body of knowledge so that the existing items are
efficiently found:
- aggregate overlapping documentation items
- split items that are too broad
- write index pages or meta pages
- check for broken links in pointers to internal or external
knowledge
- produce bug reports
- elaborate new strategies and patterns to handle new kinds of requests
- identify recurring patterns
- provide the community with more knowledge and instruments to
improve the quality of its mutual help
- produce reports on what people's goals are
- produce reports on what people's problems are
- Developers
- reporting commonly asked questions to the documentation team
- handling bugs
- getting aware of documentation team reports
- Marketing staff (if it exists)
- Analysing documentation team reports
- Analysing view hits and popularity of the existing documentation
* Development milestones
Milestone 0: starting the system with the existing tools
- Bugzilla can be used to report questions for the documentation,
and for gathering insight from the reporters (for example,
creating a "help" or "documentation" (pseudo)package)
- Documentation items can be added to the existing help center
(http://www.ubuntulinux.org/support/documentation/helpcenter_view)
- When closing Bugzilla requests, post a link to the documentation
item in Plone
- If possible and not yet existing, an e-mail interface to bugzilla
should be set up to allow Cc-ing list threads, so that they can be
collected and archived as relevant to the trickier open issues.
- The documentation team should be identified
- The rules of the game should be identified and put down clearly,
in order not to create false expectations
- This possibility of getting help on Ubuntu should be made known,
at least to those users that can handle the bugzilla interface
- The new documentation practices should be documented, presented to
the community, advertised and explained
Milestone 1: structuring the knowledge base
- Improve the structure of the online documentation, creating ontologies
and categorising the existing pieces of information, producing
indexes and other meta pages, aggregating overlapping items.
Milestone 2: improve problem solving documentation and infrastructure
- Some simple form of documentation can be produced (guidelines,
checklists, forms, hall of fame/hall of shame, tips...) to try to
improve the quality of the community process of going from asking
the questions to creating the final documentation piece.
- Create a specific interface to make "asking questions" more
efficient: on the one hand, the complexity of Bugzilla can be
hidden for this specific application, presenting an interface
which is more like an Enquiry Tracking System than a Bug Tracking
system; on the other hand, more specific triage and collection of
information can be coded in this interface.
Milestone 3: feedback infrastructure for the documentation team
- Depending on the gained experience, it may be important to design and
implement specific infrastructure for aggregating the feedback that can
be collected from the various parts of the system: goals in goal-oriented
requests, problems in problem-oriented requests, popularity of existing
documentation, direct feedback on existing documentation via simple
"problem solved? [ ] Yes [ ] No: ________________" polls in the
documentation page.
- Tests can be conducted with a loose feedback form format, which could
later be tightened and processed automatically to provide polls and
statistical data that aid and complement future goals, decisions and
directions
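The "problem solved?" poll could feed a very small aggregation step. A
sketch under the assumption that responses are collected as pairs of
(documentation item, yes/no answer); the item names are invented:

```python
from collections import defaultdict

def tally(responses):
    """Aggregate (item, solved) poll responses into per-item yes/no
    counts and a solved ratio, so the documentation team can spot the
    items that fail to solve the reader's problem."""
    counts = defaultdict(lambda: {"yes": 0, "no": 0})
    for item, solved in responses:
        counts[item]["yes" if solved else "no"] += 1
    report = {}
    for item, c in counts.items():
        total = c["yes"] + c["no"]
        report[item] = {**c, "solved_ratio": c["yes"] / total}
    return report
```

An item with a low solved ratio is a direct candidate for the reopening
and rework cycle described earlier in the proposal.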
Milestone 4: feedback infrastructure for the other staff, and possibly also
for the users
- The resulting feedback information needs to be presented to developers
and marketing staff (if they exist or if they are not already involved in
the interaction with users) in a language that they can understand. An
appropriate format should be identified (for example a statistics page,
or a newsletter) and the appropriate infrastructure for its production
should be set up.
- Once results have been collected from the tests on feedback form
formats, one format should be agreed on. A desktop feature can then be
provided to make feedback submission straightforward.
* Measurable goals
Milestone 0: starting the system with the existing tools
Questions should be asked, and documentation should be produced. Some
possible metrics to be collected are:
- Number of questions asked via bugzilla
- Number of questions collected on the mailing lists
- Number of questions reported by the developers
- Number of questions that produced a documentation item
- Number of questions that produced a bug report
- Number of questions that were considered unreasonable
- Number of questions that have already been answered
- Time elapsed between the initial report and the report being closed
- Number of mails exchanged with the reporter in order to get enough
insight to resolve the issue
- Number of people authoring documentation items on plone
- User feedback and comments on the initiative
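Several of the metrics above could be computed mechanically once
questions are exported from bugzilla. A sketch under the assumption
that each question is a record with hypothetical "source", "outcome",
"opened" and "closed" fields; the field names and categories are
invented for illustration:

```python
from datetime import date

def metrics(questions):
    """Compute a few Milestone 0 metrics from question records.
    Each record is assumed to carry a 'source' (e.g. 'bugzilla',
    'list', 'developer'), an 'outcome' (e.g. 'documented', 'bug',
    'unreasonable', 'duplicate'), and opened/closed dates."""
    by_source = {}
    by_outcome = {}
    days_open = []
    for q in questions:
        by_source[q["source"]] = by_source.get(q["source"], 0) + 1
        by_outcome[q["outcome"]] = by_outcome.get(q["outcome"], 0) + 1
        if q.get("closed"):
            # time elapsed between initial report and closure
            days_open.append((q["closed"] - q["opened"]).days)
    avg = sum(days_open) / len(days_open) if days_open else None
    return {"by_source": by_source,
            "by_outcome": by_outcome,
            "avg_days_open": avg}
```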
It is to be expected that questions get asked, and that few of them are
considered unreasonable; otherwise, there may be problems in how the
documentation/help service is presented to the community.
Questions reported by the developers are expected to be the majority, as
they are the ones who have been handling the relationship with the
community so far. The number of questions collected on mailing lists
should also be high, as mailing lists are already an established
communication channel.
Milestone 1: structuring the knowledge base
As the body of documentation grows and its navigation and searching
interface scales, there should be variations in the metrics reported
above. It is expected that the number of questions asked gets lower, or
at least that it does not explode with the increasing size of the user
base, since common and recurring goals or problems should be easily
found by users on the documentation base.
In particular, the number of questions asked that have already been
answered should get lower and tend to zero if the presentation of the
knowledge base is really efficient.
If this does not happen, analysing which questions get re-asked can
provide insight into shortcomings of the archive
presentation or of the produced documentation, or highlight some obscure
areas of the distribution itself that need more work.
Milestone 2: producing infrastructure to improve asking questions
The efficacy of this step should be measured by counting the number of
requests reported through the new interface, and by the decrease in
requests made through other channels.
Other parameters expected to change are the time elapsed between the
initial report and the report being closed, as well as the number of
mails exchanged for every report, since one of the main goals of
the reporting interface is to improve triaging and collection of
information.
Milestone 3: feedback infrastructure for the documentation team
This would introduce a new metric on the quality of the documentation
items, showing which ones should be improved and how.
Better quality of the documentation should cause a reduction on the
number of requests, both new and duplicate.
Milestone 4: feedback infrastructure for the other staff, and possibly also
for the users
Feedback for other staff should help improve the quality of their work,
thus producing appropriate variations in whatever measurements are
already in place for their jobs.
Including the users in the report would allow us to see from their
reactions whether the user community recognises the results, or whether
the gained understanding is at risk of being flawed, biased or partial.
Ciao,
Enrico
--
GPG key: 1024D/797EBFAB 2000-12-05 Enrico Zini <enrico at debian.org>