Measuring Impact
Measuring impact is more than an exercise in reporting numbers: it is about telling the story of why VERSO matters. From the start, we knew that building an Open Source Program Office in academia was an experiment, and like any experiment, it needed evidence to show whether it was working. Metrics became our compass, guiding decisions, shaping strategy, and demonstrating value to stakeholders. They helped us answer critical questions: Are we lowering barriers for researchers? Are we fostering a culture of openness? Are our efforts leading to sustainable, impactful projects? This chapter explores how we approached measurement, what we learned, and why data-driven storytelling is essential for sustaining and scaling an initiative like VERSO.
Why Metrics Matter
From the earliest days of VERSO, we knew that enthusiasm alone would not sustain the program. To justify continued investment—both financial and institutional—we needed to demonstrate tangible impact. Metrics became the language through which we could tell our story, not just to leadership, but to funders, collaborators, and the broader academic community.
Metrics helped us demonstrate value to leadership by showing how VERSO contributed to UVM’s strategic goals. Numbers like the growth in open-source projects, participation in training sessions, and adoption of governance templates provided evidence that the program was making a difference. These data points turned abstract ideas about “openness” into measurable outcomes that administrators could understand and support.
They also allowed us to identify successful strategies. By tracking engagement across different initiatives—such as workshops, community events, and pilot projects—we could see what resonated with researchers and where we needed to adjust. Metrics weren’t just about accountability; they were a feedback loop that informed our decisions and helped us refine our approach.
Finally, metrics were essential for securing external funding. Foundations and federal agencies increasingly expect evidence of impact, and being able to present clear, credible data strengthened our proposals. Whether it was the number of faculty trained, the diversity of disciplines engaged, or the sustainability of projects beyond their initial grant, these metrics told a compelling story of progress and potential.
In short, metrics mattered because they transformed VERSO from an experiment into an institutionally recognized program. They gave us the credibility to grow, the insight to improve, and the leverage to advocate for open source as a strategic priority in academic research.
Key Metrics
When we set out to measure VERSO’s impact, we knew that the right metrics had to do more than count activities—they needed to capture progress toward cultural change. Our goal was to track not just outputs, but outcomes: Were we making open source easier, more sustainable, and more valued at UVM?
One of the first metrics we monitored was adoption. How many research projects were engaging with VERSO’s services? This included projects that sought licensing guidance, used our governance templates, or participated in pilot programs. Tracking adoption helped us understand the breadth of our reach and identify which disciplines were most responsive to open-source practices.
Next, we looked at engagement. Numbers alone—such as workshop attendance or Slack channel membership—didn’t tell the whole story, but they provided a baseline for gauging interest. Over time, we began to measure deeper forms of engagement, such as repeat participation, contributions to community discussions, and faculty who became champions for open source within their departments.
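To make “deeper engagement” concrete, here is a toy sketch of how repeat participation might be computed from raw attendance logs. It is an illustration only: the (participant, event) data structure, names, and events are fabricated placeholders, not VERSO’s actual records.

```python
# Toy example: distinguishing raw attendance from repeat participation.
# All records below are fabricated placeholders.
from collections import Counter

attendance = [
    ("alice", "licensing-workshop"),
    ("alice", "community-day"),
    ("bob", "licensing-workshop"),
    ("carol", "community-day"),
    ("alice", "git-basics"),
]

# Count how many event sign-ins each person has.
events_per_person = Counter(person for person, _ in attendance)
repeat_participants = [p for p, n in events_per_person.items() if n > 1]

print(f"{len(repeat_participants)} of {len(events_per_person)} attendees returned")
```

Even a tally this simple separates one-time curiosity from the returning participants who tend to become departmental champions.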
We also tracked outputs. This included the number of open-source repositories launched with VERSO’s support, the frequency of updates, and the presence of essential elements like licenses, documentation, and contribution guidelines. These indicators showed whether our training and resources were translating into best practices.
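Checks like these can be automated. Below is a minimal sketch, assuming the repositories live on GitHub, that queries the community-profile endpoint for the presence of a license, README, and contributing guide. The VERSO_GITHUB_TOKEN variable and the example repository are hypothetical placeholders, not VERSO’s actual tooling.

```python
# Minimal sketch (not VERSO's actual tooling) of an automated
# "essential elements" check against the GitHub API.
import os

import requests

API = "https://api.github.com"
HEADERS = {"Accept": "application/vnd.github+json"}
token = os.environ.get("VERSO_GITHUB_TOKEN")  # hypothetical variable; raises the rate limit
if token:
    HEADERS["Authorization"] = f"Bearer {token}"


def output_checklist(owner: str, repo: str) -> dict:
    """Report which best-practice files a repository has in place."""
    resp = requests.get(
        f"{API}/repos/{owner}/{repo}/community/profile",
        headers=HEADERS,
        timeout=10,
    )
    resp.raise_for_status()
    files = resp.json().get("files") or {}
    return {
        "license": files.get("license") is not None,
        "readme": files.get("readme") is not None,
        "contributing": files.get("contributing") is not None,
        "code_of_conduct": files.get("code_of_conduct") is not None,
    }


if __name__ == "__main__":
    # Hypothetical repository; substitute a real owner/name pair.
    print(output_checklist("example-lab", "example-tool"))
```

Run over every repository a program supports, a check like this turns “presence of essential elements” into a coverage number that can be tracked from one semester to the next.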
The hardest metrics to capture were sustainability signals. Were projects still active six months or a year after their initial release? Did they attract external contributors? Were they cited in publications or integrated into other research workflows? These signals were elusive but critical for understanding whether VERSO was helping projects move beyond the “grant-funded prototype” stage into long-term viability.
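Two of these signals, continued activity and external contributors, can be approximated from the same GitHub API. The sketch below is again illustrative rather than VERSO’s actual method: the core_team set used to separate outside contributors from the original developers is a hypothetical input, and citation tracking is left out because it does not reduce to an API call.

```python
# Illustrative sketch of two sustainability signals for one repository.
from datetime import datetime, timedelta, timezone

import requests

API = "https://api.github.com"  # same constants as the sketch above
HEADERS = {"Accept": "application/vnd.github+json"}


def sustainability_signals(owner: str, repo: str, core_team: set[str]) -> dict:
    """Approximate recent activity and external-contributor counts."""
    # Look for any commit in roughly the last six months.
    since = (datetime.now(timezone.utc) - timedelta(days=182)).isoformat()
    commits = requests.get(
        f"{API}/repos/{owner}/{repo}/commits",
        params={"since": since, "per_page": 1},
        headers=HEADERS,
        timeout=10,
    )
    contributors = requests.get(
        f"{API}/repos/{owner}/{repo}/contributors",
        params={"per_page": 100},
        headers=HEADERS,
        timeout=10,
    )
    contributors.raise_for_status()
    external = [
        c["login"] for c in contributors.json() if c["login"] not in core_team
    ]
    return {
        # Any commit since the cutoff counts as continued activity.
        "active_last_six_months": commits.ok and len(commits.json()) > 0,
        "external_contributors": len(external),
    }


if __name__ == "__main__":
    # Hypothetical repository and core team; substitute real values.
    print(sustainability_signals("example-lab", "example-tool", {"pi-username"}))
```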
Beyond Numbers
While metrics provided a valuable snapshot of VERSO’s progress, they could never tell the full story. Numbers alone—how many workshops we held, how many repositories were launched—capture activity, but not transformation. To truly understand our impact, we needed to look beyond quantitative measures and embrace qualitative insights.
One of the most powerful tools we used was storytelling. Case studies of individual projects revealed the human side of open source: a faculty member who, after attending a VERSO workshop, released a tool that became widely adopted in their field; a graduate student who learned open-source practices and leveraged that experience into a career in tech. These stories illustrated outcomes that no metric could fully convey—confidence, empowerment, and cultural change.
We also gathered testimonials from faculty, students, and community partners. Their feedback helped us understand what worked, what didn’t, and why. For example, researchers often told us that VERSO’s licensing guidance removed a major barrier to sharing their work. Others highlighted the sense of belonging they felt through our community events—a factor that can’t be measured in attendance numbers alone.
Finally, we looked at collaborations and ripple effects. When a project supported by VERSO became part of a multi-institutional grant or was cited in a major publication, it signaled that our efforts were amplifying UVM’s research impact. These qualitative indicators showed that VERSO was not just facilitating open source—it was helping to embed openness into the DNA of the university.