UXing for a B2B big data company

Case study for client project at General Assembly

Simon Titcombe

--

The final project of my UX design immersive course at General Assembly was with a real client, working on a real issue. This was a great experience for me: after completing three concept projects as part of the course, it was a good opportunity to apply my skills and knowledge in the real world.

The real world client in this case was a big data and data science specialist: a company that provides data analysis for its clients, who are predominantly in the public sector.

Working as part of a four person UX team we were assigned a project to look into the user experience of a feature of their system where users collect and share data.

After receiving the brief, the team met the client for an initial meeting where we discussed the issue and found out as much about the project as we could. This meeting was the starting point for our research.

Research

To research the issues we carried out competitive analysis alongside user research and interviews.

Competitor analysis was very useful for understanding what else was in the marketplace for sharing data. We looked at B2C products like Google Drive and Dropbox, as well as B2B sharing platforms like Huddle and Tableau.

For user research we carried out interviews: we talked to users of the current system, data scientists, and people who had worked with data previously.

Breakdown of interviewees

A number of problem areas were identified from the research, and the results were organised using affinity mapping…

Two key areas emerged:

1. Data issues, specifically around format and time. Format issues included file types and the contents within a file: for example, different ways of displaying a blank value (an empty cell, a ‘0’, or a dash), global date formats, and the way the elements of an address are constructed. Time issues related to how long it took to resolve problems, with data often having to be requested again, perhaps in a different format or layout, or even to match a common schema.
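To make the blank-value problem concrete, here is a minimal sketch in Python of a cleaning pass that maps the different blank representations onto a single value. The list of variants is entirely hypothetical, and treating ‘0’ as blank is only safe for fields that cannot legitimately hold zero.

```python
# Hypothetical cleaning pass: different senders may represent a missing
# value as an empty cell, a '0', or a dash; normalise them all to None.
# The variant set below is illustrative, not from the client's system.
BLANK_VARIANTS = {"", "0", "-"}

def normalise_blanks(row):
    """Replace any known blank representation in a row with None."""
    return [None if str(cell).strip() in BLANK_VARIANTS else cell
            for cell in row]

print(normalise_blanks(["Alice", "", "0", "-", "42"]))
# → ['Alice', None, None, None, '42']
```

A pass like this would let the data collector work from one consistent notion of "missing" instead of reconciling each sender's convention by hand.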

2. Error handling and schemas: users were not always aware of data issues/errors when they were sending data, and schemas were often not connected to the data.

Personas & User Journey

The research phase gave us enough knowledge to create two personas, which we used to understand the issue from each perspective: Gisella is the data collector and Stuart is the data sender.

Their experience was mapped to show the story of how data is requested and sent. The user journey below shows the emotional state at each step of the flow, along with a breakdown (task analysis) of each step. Mapping the journey out like this allowed us to pinpoint two pain areas for Gisella.

The first pain point (unhappy face) comes after Gisella receives data and finds that it does not match the schema, which means she has to send a new data request.

The second pain point comes when the resent data contains errors, meaning she has to request the data yet again.

Ideation

At this point we had a good understanding of the issue we wanted to work on. To start the ideation phase we first framed the issue using a ‘how might we’ statement…

How might we ensure that the data collected from respondents is of the same format or schema?

This statement was used to conduct a design studio with the client…

I facilitated a very productive design studio session with the client. With a clear focus from the start, our creative ideas all centered around four key areas:

  1. Having a main schema/template library as a starting point.
  2. Two file output: schema template (xls) and a rule sheet (pdf).
  3. Positive affirmation throughout the user’s experience.
  4. Two-step upload process (some automatic data checking at this stage, to try and avoid having to resend data).

After the session with the client, our team of four started to collaboratively sketch the ideas that came out of the design studio, using the whiteboard walls at General Assembly…

The above are the first three screens that would form our first paper prototype: (from left to right) Dashboard page, Schemas page, Schema builder page.

This gave us the initial concept ideas that we would use in our first prototype…

Prototyping

Paper prototype pages. Same as drawings above, (from left to right) Dashboard page, Schemas page, Schema builder page.

Our first prototype was created on paper. Initial testing was difficult because the system and task were quite complex. We were testing with users who had not used the current system, which added to the difficulty.

We had lots of recurring feedback from testing:

  • ‘Schemas’ icon confusion within navigation bar.
  • Where is “Your Schemas”?
  • Why can I rename up here if I have a title box?
  • Where does this share button go?
  • Why would I not have download rights on my own file?

As a group we decided that to gain any real value from testing on this project we would need to take the prototype digital. At this stage we implemented some changes based on user feedback; three of these are highlighted below…

1. (above) The paper prototype showed the icons as they are in the current system. Based on user confusion around these we decided to remove the icon graphics and use text.

2. (above) Average users were not familiar with the word ‘Schemas’, so for testing purposes we decided to change the word to ‘Template’.

3. (above) Drag-and-drop functionality was very difficult to test on paper, and user testing made it apparent that the ability to add fields by clicking them would also be useful.

User feedback was used to improve the prototype, and we moved the project into digital using Sketch, then set up a clickable walkthrough using InVision. Testing of the digital low/mid fidelity version started…

Testing a digital version gave us much more user insight than the original paper prototype: users were happier using the system on a computer and could also follow instructions better with text labels on the navigation icons.

Two key findings from testing the digital prototype are below.

1. (above) The accuracy to the schema/template was displayed on the dashboard with each file (above left pic), which didn’t make sense to the user. It made much more sense for this to be displayed after upload on the summary page (above right pic).

2. The process of constructing a schema/template is quite complex, especially for a first time user, so we looked to provide instructions wherever possible. In the picture above you can see a first clear call to action; then, within the rule sheet area and template area, text instructions were added.

At this stage the team felt it was important to implement some of the user feedback into the prototype. Adjustments were made in Sketch/InVision according to our findings, and then we began a new round of user testing at mid fidelity…

For the mid fidelity digital prototype we brought the quality up, making sure all icons were clear and consistent, and that everything had the same look and feel as the current system.

The picture above shows a change we implemented based on user feedback. Users wanted a quick way to see what the files in their dashboard were, so at this stage we tested functionality that showed a document preview when hovering over the dashboard icons.

The above picture shows a key change we had to implement, as users were not fully understanding the file accuracy check during upload. We wanted to make sure the user did not feel that a score of less than 100% meant their file had not been uploaded. To help with this, we brought the accuracy checking into step 4 of the upload process (previously it was seen only on the summary page); you can see this in the picture on the right above. We also used text confirmation so the user knew that the file was uploaded and that they had a choice to continue with the checked file, or to upload and try again after reviewing an error report.
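The accuracy check described above could work along these lines. This is a hypothetical Python sketch: the field names are made up, and a simple type comparison stands in for the real system’s schema rules.

```python
# Hypothetical upload-time accuracy check: compare each row of an
# uploaded file against a preset schema/template and report a
# percentage score. Field names and expected types are illustrative.
schema = {"name": str, "age": int, "joined": str}  # assumed template

def accuracy_score(rows, schema):
    """Return the percentage of cells matching the schema's expected type."""
    checked = matched = 0
    for row in rows:
        for field, expected in schema.items():
            checked += 1
            if field in row and isinstance(row[field], expected):
                matched += 1
    return 100 * matched / checked if checked else 100.0

uploaded = [
    {"name": "Gisella", "age": 34, "joined": "2016-01-04"},
    {"name": "Stuart", "age": "unknown", "joined": "2016-02-11"},  # bad type
]
print(round(accuracy_score(uploaded, schema), 1))  # → 83.3
```

A score like this can be shown at step 4 of the upload flow alongside a confirmation message, so a sub-100% result reads as "uploaded, with issues to review" rather than "upload failed".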

Teamwork

This project was completed as a four-person team; we all shared the workload and got involved at every stage of the process. I was assigned team facilitator, which gave me the added responsibility of resolving any team conflicts as well as being the point of contact for the client and our course leader. To work well as a team we implemented a kanban board using Trello; here’s a shot of our work in progress part way through the project…

Conclusion

For this project we looked at improving the user experience of a data collect and share feature for a data analytics system. Our research identified two pain points for users that we looked to overcome, and four key objectives were born out of the ideation stage.

The two pain points, identified and shown above in the user journey, were…

  1. Receiving data from different sources where the schema (templates) didn’t match — meaning the user spent time reorganising or re-requesting the data.
  2. Receiving data that contained inconsistencies or data errors.

Our solution allowed the data requester to set up a schema/template at the point where they request the data. We also implemented the file checker during upload, which compares the file against the preset schema/template. Together these address the two pain points. The revised user journey below shows the removal of the pain points…

The four key objectives:

  1. Having a main schema/template library as a starting point: this was implemented from the paper prototype stage as a clickable link in the main navigation, which remained throughout testing; text labels had to be added to make it clear what the navigation icon was [see Picture 1 below]. We also made some minor changes to the library page layout, introducing a tabbed layout showing a library, recent and ‘my templates’ section.
  2. Two file output: schema template (xls) and a rule sheet (pdf). This functionality was designed into the schema builder page: when adding fields (xls), rules (pdf) were automatically created. It tested well at the digital stage once we let users click fields to add them as well as dragging and dropping [see Picture 2 below].
  3. Positive affirmation throughout the user’s experience. This was added at every actionable item during the collect and share process [see Picture 3 below].
  4. Two-step upload process. This was achieved by moving the file checking stage into the four-step upload process. It tested well, and no one thought that their file had not been uploaded.
Picture 1 (above): Schema/Template icon added to the current system’s navigation menu
Picture 2 (above): Two file output
Picture 3 (above): ‘Thank you’ & ‘Congratulations’ messages

Next steps

Error reporting: a reporting feature could be created to help users easily identify and fix errors.

Response tracking: a tracking feature could help requesters keep track of their requests and improve process visibility.

UI & icons review: icons, CTAs and menu options should be reviewed to apply aesthetic, consistency and standards principles.

Thank you for reading my article, I am a junior UX designer looking for a full time job, please take a look at my online portfolio here.
