
Discussion on responsiveness metrics #13

Closed
dicortazar opened this issue May 30, 2019 · 11 comments
Labels
metric idea This idea is open for discussion. Ideally, fill out a detail page to focus the discussion.

Comments

@dicortazar (Member)

The goal of this task is to start the discussion about the several ways we can track responsiveness metrics.

From a broader perspective, these responsiveness metrics can be observed in several of the data sources for software development.

As an example, a code review process may have several steps such as:

  • the time to merge
  • the time waiting for a submitter action
  • the time waiting for a reviewer action
  • the time for each of the iterations in a code review process
  • the time to merge into master since a review is approved
  • the time for approval
  • the time to first response

In addition to this, we may find other measurable times in other places: in a mailing list, the time to get a first answer; or in a more forum-based format such as Discourse or Stack Overflow, the time to get an accepted answer.

With all of this in mind, and as a suggestion, we may start writing down the motivation of these metrics (mainly the goals we're following) and then start producing the specific questions and metrics.

As potential goals related to this, I have a couple in mind:

  • Efficiency in the software development process
  • Care for the volunteer community
@germonprez (Contributor)

germonprez commented Jun 13, 2019

What is the current state of responsiveness metrics? I'm not sure whether particular metrics are being developed, particularly:

  • Time to close
  • Time to first response
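For illustration, Time to close could be summarized across a set of issues like this (the `opened_at`/`closed_at` field names and the sample records are hypothetical; a median is used because close times are typically skewed by a few long-lived issues):

```python
from datetime import datetime
from statistics import median

def time_to_close_days(issue):
    """Days between an issue being opened and closed."""
    opened = datetime.fromisoformat(issue["opened_at"])
    closed = datetime.fromisoformat(issue["closed_at"])
    return (closed - opened).total_seconds() / 86400

# Hypothetical issue records, for illustration only.
issues = [
    {"opened_at": "2019-05-30T09:00:00", "closed_at": "2019-06-01T09:00:00"},
    {"opened_at": "2019-06-02T09:00:00", "closed_at": "2019-06-16T09:00:00"},
    {"opened_at": "2019-06-03T09:00:00", "closed_at": "2019-06-04T21:00:00"},
]

print(median(time_to_close_days(i) for i in issues))  # 2.0
```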

@geekygirldawn (Member)

@mpgjon

mpgjon commented Jul 10, 2019

A question, if I may: is there yet a direction here on measuring the actual textual sentiment in responses (e.g., natural language processing to measure positive/negative sentiment in comments)? I understand something like this is happening in CrossMiner, and I wasn't clear whether that kind of measurement is completely out of scope for CHAOSS.
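For context, a lexicon-based score is one simple form such a sentiment measurement could take. The word lists below are tiny illustrative stand-ins, not a real sentiment lexicon, and this is a sketch of the general idea rather than how CrossMiner actually does it:

```python
# Tiny illustrative lexicons; a real tool would use a full sentiment lexicon.
POSITIVE = {"great", "thanks", "good", "helpful", "works"}
NEGATIVE = {"broken", "bad", "fails", "wrong", "slow"}

def sentiment_score(comment):
    """Return (#positive - #negative) / #words, in [-1, 1]; 0.0 for empty text."""
    words = comment.lower().split()
    if not words:
        return 0.0
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    return (pos - neg) / len(words)

print(sentiment_score("Thanks, this works great!"))  # 0.75
```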

@GeorgLink (Member)

GeorgLink commented Jul 11, 2019 via email

@klumb (Member)

klumb commented Aug 8, 2019

I think responsiveness and sentiment are two different metrics. Responsiveness should just focus on time. Sentiment should be built separately and a composite or collection/dashboard of responsiveness and sentiment could be discussed.

@geekygirldawn (Member)

@dicortazar to look at the ones that aren't already included in some of our recently released metrics and add them to the release spreadsheet: https://docs.google.com/spreadsheets/d/1tAGzUiZ9jdORKCnoDQJkOU8tQsZDCZVjcWqXYOSAFmE/edit#gid=276406255

@GeorgLink (Member)

Daniel added metrics to the shared metric tracking spreadsheet:

https://docs.google.com/spreadsheets/d/1tAGzUiZ9jdORKCnoDQJkOU8tQsZDCZVjcWqXYOSAFmE/edit#gid=276406255

@GeorgLink (Member)

We released the metrics.

I suggest we create smaller issues for new metrics and close this gigantic issue.

Metrics not yet defined:

  • the time to merge
  • the time waiting for a submitter action
  • the time waiting for a reviewer action
  • the time for each of the iterations in a code review process
  • the time to merge into master since a review is approved
  • the time for approval

@germonprez added the "metric idea" label Sep 3, 2020
@dicortazar (Member, Author)

+1 to close this ticket.

As I was the one who opened it, I'll proceed as follows:

  • Create a new ticket for each of the previous metrics
  • Create their Google doc with the template
  • Close this ticket with references to those new tickets

@dicortazar (Member, Author)

I'm leaving the last two metrics of the discussion out of this conversation:

  • the time to merge into master since a review is approved
  • the time for approval

These are software delivery metrics and may be part of another discussion :).

@GeorgLink (Member)

GeorgLink commented Oct 13, 2020 via email


6 participants