PROJECT NAME

Trusted Analytics Platform (PaaS)

SKILLS
User Interviews, Internet Data Scrape, User Personas, User Testing, Wireframing, Technical Documentation Writing
STAKEHOLDERS
Intel Corporation

Challenge

I joined the Trusted Analytics Platform team to help enhance the user experience, which involved defining users, conducting usability testing, and improving UI design, user documentation, GitHub repositories, and product demonstrations.

User Interviews

I initiated interviews with subject matter experts within our team and recruited both internal and external data scientists to understand their work contexts, motivations, and expertise levels. These insights were consolidated into user definitions to guide our team.

Users were divided into 3 main segments: Experimentation, Solutions, and Innovation.
Slides: Defining the Users for the Experimentation, Solutions, and Innovation segments.

Scraping the Internet for Data

Alongside the user interviews, I worked with data scientists to analyze job postings, aiming to identify the most popular data science tools and prioritize platform features. One working hypothesis was that experienced users prefer command line interface (CLI) tools over graphical user interface (GUI) tools.

Slides: methodology and goal of the data scrape; experience level determined by the scrape; job postings by degree type; experience level and occurrence of tools; terminology, languages, frameworks, GUI tools, and CLI tools by data science level; and data scrape conclusions.
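
To make the method concrete, here is a minimal sketch of the keyword-counting step applied to the collected postings. The posting texts, experience-level labels, and keyword lists below are illustrative assumptions, not the actual dataset or term list from the study, and the collection step itself is omitted.

from collections import Counter
import re

# Hypothetical sample of scraped job postings, labeled by inferred experience level.
postings = [
    ("junior", "Analyst role: Excel, SQL, and some Python preferred."),
    ("senior", "Lead data scientist: Python, Spark, Hadoop, deploy models from the command line."),
    ("senior", "Machine learning engineer: Python, scikit-learn, strong CLI proficiency."),
]

# Keywords of interest, grouped the way the scrape grouped them (illustrative lists).
keywords = {
    "languages": ["python", "r", "sql", "scala", "java"],
    "cli_tools": ["command line", "cli", "spark", "hadoop"],
}

def count_keywords(text, terms):
    """Count whole-word occurrences of each term in a posting (case-insensitive)."""
    text = text.lower()
    return {t: len(re.findall(r"\b" + re.escape(t) + r"\b", text)) for t in terms}

# Aggregate keyword occurrences by experience level and keyword group.
totals = {}
for level, text in postings:
    for group, terms in keywords.items():
        counts = count_keywords(text, terms)
        bucket = totals.setdefault((level, group), Counter())
        bucket.update({t: c for t, c in counts.items() if c})

for (level, group), counts in sorted(totals.items()):
    print(level, group, dict(counts))

Aggregations of this kind fed the by-data-science-level breakdowns summarized above.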

User Testing

Using the data scientist user definitions derived from the interviews and the data scrape, I conducted user testing on the current interface, evaluating it both qualitatively and quantitatively across a series of tasks.

Slides: user testing structure; Question #1 on documentation value; Task #1 (data ingestion) and its expected user flow; statistical methodology for evaluating the data; Task #1 results; task completion rate vs. margin of error; mean task completion times; confidence interval by user; and benchmark measures for future research.
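
For the quantitative side, the sketch below illustrates the kind of small-sample statistics commonly used for this analysis: an adjusted-Wald (Agresti-Coull) confidence interval for task completion rate and a geometric mean for task times. The session data is illustrative only; the actual results are summarized in the slides above, and the specific interval method used in the study may have differed.

import math

# Hypothetical Task #1 observations: completion outcome and time on task in seconds.
completions = [True, True, False, True, True, False, True, True]
times_sec = [210, 185, 340, 160, 205, 420, 150, 175]

def adjusted_wald_interval(successes, trials, z=1.96):
    """95% adjusted-Wald (Agresti-Coull) interval for a completion rate."""
    n_adj = trials + z ** 2
    p_adj = (successes + (z ** 2) / 2) / n_adj
    margin = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return max(0.0, p_adj - margin), min(1.0, p_adj + margin)

def geometric_mean(values):
    """Geometric mean, the usual central measure for skewed task-time data."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

successes = sum(completions)
low, high = adjusted_wald_interval(successes, len(completions))
print(f"Completion rate: {successes}/{len(completions)} (95% CI {low:.0%} to {high:.0%})")
print(f"Geometric mean task time: {geometric_mean(times_sec):.0f} s")

Reporting an interval rather than a single completion rate keeps small-sample results honest, which is why completion rate was paired with margin of error in the results.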

GitHub Documentation

The research identified documentation as an important part of the experience for data scientists, which prompted a rewrite and reorganization of the platform's wiki documentation on GitHub.

VERSION 0.6 (Before Improvements):

https://github.com/trustedanalytics/platform-wiki-0.6/wiki

VERSION 0.7 (After Improvements):

https://github.com/trustedanalytics/platform-wiki-0.7/wiki

Results

The user definitions guided new feature development, and the data scrape led to prioritizing the expansion of Python-based features to reach a wider audience. Insights from the usability test sessions drove significant improvements to the web UI, and all of the research informed the GitHub documentation, improving the out-of-the-box experience for our target users.
