In project management, there is a concept called the co-manager model, which gives the client and the technical team equal authority and responsibility for managing the project. This means the technical product manager has to be willing to share responsibility for the project with the client product manager. With that in mind, take a moment to think: if my team is the data science / technical product team, then who is my client product team?
In a previous role, I worked on the technical team for a client that understood how to do data analysis but needed a technical team to develop the machine learning and analytics that fueled their dashboards. To manage this engagement, it became crucial to strike a balance between the technical co-manager and the client co-manager in running the project. Our end goal was to produce dashboards for device health management that would show which devices posed a high risk of problems and required attention, as opposed to devices in acceptable condition.
To maintain this connection between the two teams, we met on a weekly call to discuss project status, next steps, and demos. In the beginning, the meetings were strained. The client felt they were not getting what they expected from the product, while the technical team felt they were meeting their goals. This conflict led to a lack of full engagement when the team asked for feedback on the work being done. Within a few months of recognizing these issues, we empowered the client's product manager to take on more responsibility in the project and begin making decisions based on their business case and desired outcomes.
We did this by breaking down our technical work and challenges into terminology the client could relate to their device health monitoring. Essentially, we were following NIST's Four Principles of Explainable Artificial Intelligence:
Knowledge Limits: The system only operates under conditions for which it was designed or when the system reaches a sufficient confidence in its output.
We created the dashboards only for a specific subset of devices that analysts were investigating for device health management. This narrowed the project's scope, allowing us to build our models under known conditions. That scope had been agreed upon earlier; we just needed to improve the explainability of our work within the conditions the client was requesting.
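To make this concrete, here is a minimal sketch of what a knowledge-limits gate can look like in code. This is illustrative only, not our project's code: the model, the feature data, the labels, and the 0.75 confidence threshold are all hypothetical stand-ins.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical threshold below which the system declines to label
# a device rather than guess outside its knowledge limits.
CONFIDENCE_THRESHOLD = 0.75

def assess_device_health(model, features):
    """Return a health label only when the model is confident enough."""
    proba = model.predict_proba(features.reshape(1, -1))[0]
    confidence = float(proba.max())
    if confidence < CONFIDENCE_THRESHOLD:
        # Knowledge limit reached: route to a human analyst instead.
        return {"label": "needs_analyst_review", "confidence": confidence}
    label = model.classes_[int(proba.argmax())]
    return {"label": str(label), "confidence": confidence}

# Toy data standing in for device maintenance features (also hypothetical).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = np.where(X[:, 0] + X[:, 1] > 0, "at_risk", "healthy")

model = RandomForestClassifier(random_state=0).fit(X, y)
print(assess_device_health(model, X[0]))
```

The key design choice is that a low-confidence device is never silently labeled; it is routed to an analyst, which keeps the system operating inside the conditions it was designed for.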
Explanation: Systems deliver accompanying evidence or reason(s) for all outputs.
We created dashboards that showed enough detail about how the models arrived at the conclusion that a device was or was not in good health. Analysts could click down into the underlying data to see the maintenance activities the model used to reach its decision, which gave them the evidence they needed to support the model's output and an explanation for what they were seeing.
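As a sketch of the idea, here is one simple way a prediction can carry its own evidence. I am using a linear model here because its per-feature contributions are exact by construction; the feature names and data are hypothetical, and our actual dashboards drilled into maintenance records rather than model coefficients.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical feature names standing in for maintenance-activity data.
FEATURES = ["days_since_service", "fault_codes", "vibration_rms", "temp_delta"]

def explain_prediction(model, x, top_k=3):
    """Pair a prediction with its evidence: for a linear model, each
    feature's contribution to the log-odds is simply coef * value."""
    contributions = model.coef_[0] * x
    ranked = np.argsort(np.abs(contributions))[::-1][:top_k]
    return {
        "prediction": int(model.predict(x.reshape(1, -1))[0]),
        "evidence": [(FEATURES[i], round(float(contributions[i]), 3)) for i in ranked],
    }

# Toy training data (hypothetical): 1 = at risk, 0 = healthy.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = (2 * X[:, 1] + X[:, 2] > 0).astype(int)

model = LogisticRegression().fit(X, y)
print(explain_prediction(model, X[0]))
```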
Meaningful: Systems provide explanations that are understandable to individual users.
Allowing the analysts to see the underlying data along with the model's predictions of good or bad health was a start, but it also had to be meaningful to the client. The UI across the different screens was easy to navigate, and each level explained, in the analysts' own terminology, how the analysis was formed and how they could use the output to make a decision about a device grouping.
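A sketch of that translation layer might look like the following. The terminology table is invented for illustration, but the principle is the same: nothing should reach the dashboard in raw model vocabulary.

```python
# Hypothetical translation table from raw model vocabulary to the
# device-health terminology the analysts actually used day to day.
ANALYST_TERMS = {
    "at_risk": "At risk: flag for maintenance review",
    "healthy": "Acceptable: no action needed",
    "fault_codes": "Fault codes logged since last service",
    "vibration_rms": "Average vibration level",
}

def render_for_analyst(label, evidence):
    """Turn a raw model result into one analyst-readable summary line."""
    headline = ANALYST_TERMS.get(label, label)
    reasons = "; ".join(
        f"{ANALYST_TERMS.get(name, name)} ({value:+.2f})"
        for name, value in evidence
    )
    return f"{headline} (key factors: {reasons})"

print(render_for_analyst("at_risk", [("fault_codes", 1.80), ("vibration_rms", 0.92)]))
```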
Explanation Accuracy: The explanation correctly reflects the system’s process for generating the output.
With explanation and meaning comes the need for accuracy. Once we generated these outputs, we demoed the results on the weekly calls, and it was in these demos that client feedback became important. By combining their domain expertise with our models, we could evaluate the accuracy of the results and make changes based on the client's response to what they were seeing. We were also recreating their manual analyses as explainable machine learning models to increase productivity. The more they understood our modeling, the better the feedback became, as they could propose feature requests and improvements grounded in their domain knowledge. This, in turn, enabled better collaboration between the teams to improve the overall explainability of the dashboard outputs.
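One common way to sanity-check explanation accuracy, shown here purely as an illustration rather than as what we ran on the project, is to measure how faithfully an interpretable surrogate tracks the full model. If a shallow tree agrees with the model almost everywhere, the simple story it tells is an accurate account of how the outputs are generated.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Toy features standing in for device maintenance data (hypothetical).
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)  # 1 = at risk, 0 = healthy

# The full model that would power the dashboard.
model = RandomForestClassifier(random_state=0).fit(X, y)

# A shallow, human-readable surrogate trained to mimic the full model.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, model.predict(X))

# Fidelity: how often the simple explanation agrees with the real system.
fidelity = accuracy_score(model.predict(X), surrogate.predict(X))
print(f"Explanation fidelity: {fidelity:.1%}")
```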
Towards the middle of the contract, our clients better understood the dashboard demos being presented and how the algorithms behind them were making decisions. Through better explainability, and by communicating in the client's language, we established an open and honest dialogue. The client took on more responsibility for the project's direction and provided valuable feedback on what was working as they expected and what was not. Now that they understood what we were presenting, they could give input backed by their industry domain knowledge and experience. This feedback allowed us to produce a practical solution that created the expected business value for the analysts using the dashboards, and it empowered both teams to actively participate in every decision and requirement change made to achieve these objectives.
Final Thoughts
Data scientists often work closely with business analysts and clients who need to understand their work and the impact it will have on them. Teams need to ensure the client is in the best position to make meaningful and timely decisions that will benefit the project. You can facilitate this by explaining your data science processes in the client's language and ensuring they understand how your algorithms predict events. Providing all of this information empowers your clients in their decision-making and in their ownership of the product they support. It also creates a sense of ownership for both teams and lets everyone feel they are contributing to the roadmap's success.
Thanks for reading! I hope you enjoyed learning about my experience. If you would like, you can support my writing by becoming a Medium member using this link.