T329506 User Facing Metrics Platform Documentation
Closed, Resolved · Public

Description

Hypothesis owner: @apaskulin
Hypothesis owner delegate: @VirginiaPoundstone
Start date: Jul 8, 2024
Target completion date: Nov 1, 2024
Actual completion date: Nov 1, 2024
Hypothesis ID: SDS2.1.5

Hypothesis

If we design a documentation system that guides the experience of users building instrumentation using the Metrics Platform, we will enable those users to independently create instrumentation without direct support from Data Products teams, except in edge cases.

Scope

In scope:

  • Creating an instrument
  • Data contract and schema design docs
  • Creating a place in the documentation system to add docs for to-be-developed features
  • Client library documentation
  • Template instrument docs standard

Out of scope:

  • Documentation for concepts that are core to role-based expertise such as the basics of experiment design and data analysis
  • Leading documentation for all features added during the fiscal year
  • Maintainer docs for MP core and clients
  • Migrating an existing instrument will be considered an edge case, and users will be directed to consult Data Platform

Ownership

Following the project's completion, the Data Products team will own all documentation created by this project as part of their overall ownership of the Metrics Platform. After the project, Tech Docs will be available for reviews and help through our regular support channels.

Project plan

Hypothesis result

This hypothesis was a success! In September 2024, the Growth team used the updated docs to create an instrument. Growth engineer Sergio Gimeno Saldaña stated that the docs were very helpful and that Growth was able to set up their instrument without asking Data Products questions about documented functionality. This fulfills the original criterion of the hypothesis: to design documentation that enables users to independently create instrumentation without direct support from Data Products teams, except in edge cases. The Data Products team will continue to evaluate the effectiveness of the docs and iterate on them as more teams build instrumentation with the Metrics Platform.

Outcomes

  • Documented end-to-end workflows: I created a complete guide to creating an instrument with Metrics Platform from start to finish, including processes that are required but sit outside Metrics Platform itself (like following the Data Collection Guidelines). Uncovering these hidden requirements allows users to feel confident creating instruments without direct support from Data Products.
  • Created modular docs to support UI tools: The ultimate goal of Metrics Platform is to provide a tool that allows product managers to create instrumentation with minimal technical complexity. To support this, I designed a modular documentation structure, with pages covering specific topics that can be linked to from UI tools to provide additional guidance for users completing tasks in the UI. I then added these links to the current MPIC UI to improve the clarity and usability of the tool, which were critical issues identified in the MPIC UX study (SDS2.1.4).
  • Reduced duplication: When I started working on the Metrics Platform docs, the most critical issue was duplication. This is a very common problem across all docs, since information tends to proliferate naturally. As part of this project, I consolidated the existing Metrics Platform documentation so that all information is now stored in a single place, using interlinking, page transclusion, and section transclusion (see the wikitext sketch after this list). This is evidenced by the fact that I removed nearly as much content as I added during this project (~124,500 bytes added versus ~115,600 bytes removed).
  • Ensured accuracy and consistency: With a rapidly evolving system like Metrics Platform, it's difficult to keep the docs up to date with the latest capabilities. I worked with Data Products to update the docs to reflect the latest system implementation. I also added citations in the docs with references to places in the codebases where functionality is defined, which will help ensure accuracy going forward.
  • Created templates to streamline processes: Data collection activities require steps that reach beyond Metrics Platform. Any team creating instrumentation must also complete processes owned by other teams, such as Legal and Product Analytics. To make these processes easier and faster to complete, I worked with Product Analytics to create a standard template for data collection measurement plans and an updated template for instrumentation specs. These templates will improve the experience of everyone starting data collection, not just using Metrics Platform.
  • Improved understanding of system functionality within Data Products: While the main goal of the project was to improve user-facing docs, the improved docs also helped increase knowledge of system functionality within the Data Products team. Metrics Platform includes a significant amount of complexity in the interaction between schemas and clients, so by documenting this, I was able to help increase knowledge sharing within Data Products and make it easier to onboard new system maintainers in the future.
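
To illustrate the consolidation approach mentioned above, here is a minimal wikitext sketch of section transclusion. It assumes the Labeled Section Transclusion extension is available on the wiki; the page name (Metrics Platform/Creating an instrument) and section name (prerequisites) are hypothetical placeholders, not the actual pages.

  On the source page, mark the reusable section:

    <section begin=prerequisites />
    Before creating an instrument, review the Data Collection Guidelines.
    <section end=prerequisites />

  On any other page that needs the same content, transclude it instead of copying it:

    {{#lst:Metrics Platform/Creating an instrument|prerequisites}}

  A whole page can likewise be transcluded with {{:Metrics Platform/Creating an instrument}}.

Because the text lives in only one place, edits to the source page propagate to every page that transcludes it, which keeps duplication from creeping back in.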

Documentation

Recommended next steps

As part of this project, I identified these recommended next steps for Data Products to take to improve the docs, to be prioritized alongside their other work as they see fit:

In addition to these next steps, there are a few tasks related to this project that are still under review by Data Products. I'll continue to monitor any updates on these tasks over the next couple of weeks.

Risks

In the same way that there's always a risk of introducing tech debt in a codebase, there's an ongoing risk of introducing information debt in docs in the form of duplication. As Data Products works on potential training resources (T370183) and the experimentation scorecard (T374981), the team should take care to integrate these efforts with the docs rather than duplicating information.

Methodology

I used a unique methodology for this project that was tailored to documenting an evolving system. Typically, a documentation project begins with a complete audit of the existing content and a design for the information architecture. Instead, to adapt to evolving workflows and to support emerging users of the platform, I started by focusing on improvements to a central document, testing it with users, and planning improvements to other pages from there. This allowed me to gain clarity on how the system works and reach a usable version of the docs early on, which supported the Growth team in using the docs to create their instrument mid-quarter. As a second step, I took a more traditional, top-down approach and audited the existing content to identify further opportunities for consolidation and clarification. As a final step, I looked at the documentation needs relative to the planned user experience and designed an information architecture to meet these needs. This three-stage process was effective in supporting early prototyping and frequent feedback cycles and could be used as a model for documenting rapidly evolving products.

Related Objects

Status     Assigned
Resolved   apaskulin
Resolved   phuedx
Resolved   phuedx
Resolved   phuedx
Resolved   phuedx
Resolved   apaskulin
Resolved   apaskulin
Declined   apaskulin
Resolved   apaskulin
Resolved   apaskulin
Resolved   apaskulin
Resolved   apaskulin
Resolved   apaskulin
Resolved   apaskulin
Resolved   apaskulin
Resolved   apaskulin
Resolved   phuedx
Resolved   Milimetric
Open       apaskulin
Resolved   Sfaci
Resolved   VirginiaPoundstone
Open       JEbe-WMF
Resolved   Sfaci
Resolved   apaskulin

Event Timeline

DMartin-WMF changed the task status from Open to In Progress. Aug 16 2023, 3:11 PM
DMartin-WMF changed the status of subtask T329509: PHP Metrics Platform Documentation from Open to In Progress.
DMartin-WMF changed the status of subtask T329511: Java Metrics Platform Documentation from Open to In Progress.

The collection of wikitech pages starting with Metrics_Platform is in good shape IMO.

I would like to keep this ticket open while I take another pass through the content for completeness and suitability for beginners.

Reviewing this epic, I think it is too broad to be easily actioned on. @DMartin-WMF, what are your thoughts about it? Do you have enough information here to confidently complete this work?

@TBurmeister adding you to this task because it would be great to work with you on how to structure our documentation process for metrics platform. Is this something you can help with?

Hi Virginia! I need more context and a clearer understanding of what type of assistance you're looking for. I don't think I can take on project work or formal advising for this, since (according to our planning doc), SDS2.5.1 is out of scope for the Tech Docs team this year; we'll be working on SDS2.1.1 and SDS3.4.1. Happy to meet with you and @apaskulin to discuss more, reorient, and/or clarify if my understanding is incorrect.

For specific documentation questions or general advising outside of planned APP-related work, the Metrics Platform team can always get feedback from technical writers by:

Thanks @TBurmeister I will discuss with team and you might see us pop into office hours.

DMartin-WMF subscribed.

I'm unassigning myself because I reached a good stopping point and I think the readability and usability of the current wikitech pages are acceptable for the initial "monoschema" version of Metrics Platform. Other folks will be updating the documentation for the subsequent "multischema" versions; I'm not so well-qualified to do that.

After further discussion with Virginia and Will, the Tech Docs Team will be picking this work up in July; moving this to our backlog for now.

apaskulin raised the priority of this task from Low to High. Jul 17 2024, 10:00 PM
apaskulin updated the task description.
apaskulin updated the task description.

This project is now complete! See the task description for outcomes.

I'd like to give a big shoutout to the Data Products team for all their help during this project. They were responsive and thorough in their reviews and always took the time to answer my questions. Their team processes made it easy for me to integrate with their work, even after the departure of their engineering manager. Thank you, Clare, Dan, Jennifer, Marcel, Sam, Santi, and Xabriel! Also a big thanks to Virginia for her product support and Sarai for her enthusiastic collaboration between docs and UX!