Redesigning the Write a Review Form



I was the lead designer for a lab focused on product detail page features that help the user gain confidence in purchasing a product. 70% of customers who view the product detail page read reviews, confirming what we all know: reviews are a crucial part of the customer's decision-making process. With a growing catalog, we needed to update the review form to make it easier for our customers to complete, a foundational piece of our 2019 roadmap.

Prototyping Tools: Axure RP 9 & Sketch


As the lead product designer for the lab, I worked closely with the Product Manager, providing all design documentation across the discovery, design, and delivery phases. I partnered with a UX researcher by providing competitive analysis and scripts for rapid testing, wrote use cases with my Product Manager, provided specs to developers, and was also highly involved in quality assurance processes.


The Challenge:

Usability Improvement

1. Star Ratings

One problem we noticed was that if the user skipped the star rating and tried to submit, the form would throw an error but would not scroll the user back to the field that caused it.
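The fix we wanted can be sketched as a pure validation step that returns the first missing required field, so the submit handler can scroll the page back to it instead of failing silently. This is an illustrative sketch with hypothetical field names, not the production code:

```typescript
// Hypothetical shape of the review draft being validated.
type ReviewDraft = { stars: number | null; comment: string };

// Return the id of the first required field left blank, or null if the
// draft is complete. The caller can then scroll that field into view.
function firstMissingField(draft: ReviewDraft): string | null {
  if (draft.stars == null) return "stars";
  if (draft.comment.trim().length === 0) return "comment";
  return null;
}

// In the browser, the submit handler would use it like:
//   const missing = firstMissingField(draft);
//   if (missing) document.getElementById(missing)?.scrollIntoView({ behavior: "smooth" });
```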

2. Fit Sliders

85 to 90% of the sliders received little to no interaction, and in user testing customers had mixed opinions about the sliders; some did not know how to interact with them at all. This part of the form was also unavailable on mobile.

3. Nickname

In user testing, users ranked this as the least valuable information to know. One user said that "it doesn't matter who wrote it."

4. Optional Fields  & Unnecessary Questions

98% of the optional questions were left blank by users. In our rapid testing, 5 out of 5 users didn't open the accordion on their own.

Previous Review Form: Mobile View


Previous Review Form: Desktop View


If we simplify the form by removing optional questions (irrelevant to users and uninformative for internal analysis) and make it intuitive and simple to complete,


then the customer will be more likely to submit a review when they opt into the Write a Review form,


resulting in an increase in review submission rate.


The Solution: Simplify

Our high-level goals were to increase interaction with the review form, the number of reviews submitted, and the overall review submission rate. Based on the qualitative research we conducted, we knew we had to simplify the UI of the form, update the questions, and remove any unnecessary questions that added cognitive load.

Progressive Disclosure

Instead of exposing all the form fields at once, we focused the user on one task: rating the product with a number of stars. This prevents the user error of skipping the rating before submission. With each star, the comment field label changes dynamically as follows:

5 stars — Yay! What did you like?

4 stars — What stands out?

3 stars — Tell us what you liked or didn't like.

2 stars — Tell us what we could improve.

1 star — What could have been better?
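The mapping above is simple to express in code. A minimal sketch (the identifiers here are illustrative; the production copy would live with the rest of the site's content):

```typescript
// Comment-field label per star rating, exactly as listed above.
const COMMENT_LABELS: Record<number, string> = {
  5: "Yay! What did you like?",
  4: "What stands out?",
  3: "Tell us what you liked or didn't like.",
  2: "Tell us what we could improve.",
  1: "What could have been better?",
};

// Fall back to a neutral prompt for an unrated draft (assumed default copy).
function commentLabel(stars: number): string {
  return COMMENT_LABELS[stars] ?? "Tell us about this product.";
}
```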

Character Counter

Before the redesign, the form only indicated the maximum character count (1,000). If the user didn't meet the minimum 50-character requirement, the error appeared only after submitting the form, which is extremely frustrating. By showing a character counter, the user receives instantaneous feedback that they've reached the minimum needed to complete the review.
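The counter logic is tiny. A sketch under the limits described above (the message copy is illustrative):

```typescript
const MIN_CHARS = 50;   // minimum required to submit
const MAX_CHARS = 1000; // maximum allowed

// While under the minimum, count down to it; once met, show progress
// toward the maximum, as a live character counter would.
function counterMessage(text: string): string {
  const remaining = MIN_CHARS - text.length;
  if (remaining > 0) return `${remaining} more characters needed`;
  return `${text.length}/${MAX_CHARS}`;
}
```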



To make the form as frictionless as possible, we found a way to autofill a mandatory field when we had the data available. Since users must log in before writing a review, and the name was already part of a backend call for similar data, we were able to prefill the nickname field with the user's first name as the default, removing one more field to fill in. If the user wants to change it, the name is editable.
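The prefill rule can be sketched as a form-state initializer; the types and names here are assumptions for illustration, not the actual backend contract:

```typescript
// Assumed shape of the logged-in user returned by the existing backend call.
type User = { firstName?: string };

type ReviewFormState = { nickname: string; nicknameEditable: boolean };

// Prefill the nickname from the session when available; the field stays
// editable either way, so the user can always override the default.
function initReviewForm(user: User): ReviewFormState {
  return { nickname: user.firstName ?? "", nicknameEditable: true };
}
```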



For this MVP, we decided to launch a generic form without any optional or hidden questions. We also removed the question "Would you like to recommend this product?", which is redundant with the star rating.


Add Delight

After the user submits the form, we added a fireworks micro-interaction to enhance the feedback upon completion. I brought this idea to the team, and it was executed by a developer under the direction of my creative partner.


We A/B tested this form against the existing form and, at 95% confidence, saw a significant +1% increase in review submissions, which translated into a projected lift of 1 million reviews annually. We scaled this feature pre-holiday 2019, and it is now the new control that we will test custom forms against in upcoming A/B tests. The custom forms target specific key categories that would benefit from adding contextual information to the reviews customers read: clothing, beauty, and boots & shoes.

Next Steps

Other ideas we have in our backlog that we would love to iteratively test include:

A. Revealing the entire form with the new design

B. Progressively capturing partial submissions whenever the user abandons the review form

C. Static vs. dynamic labels on the comment field

D. Review incentivization

Updating Optional Questions

The next phase in our roadmap is to design treatment arms for each key category to identify the contextual information users will find informative while reading reviews. Since we've scaled the generic form, these treatment arms will be tested against it for their respective categories. Our goal is to increase the quality of reviews, reduce returns, and increase ATB by supporting consumer confidence, all while maintaining the review submission rate.


Clothing Form

In partnership with our UX research team, our lab conducted extensive qualitative research for all of the custom forms, including a competitive study. Our customers told us that they like to read reviews with "skimmable fit" content. We simplified the old 7-point slider into a 5-point scale of radio buttons and validated that this was intuitive for all users via rapid testing. We were excited to include height and weight ranges to add context when a user says "the sweater was too long!", information that all users who participated in our studies were willing to provide.

See Clothing Competitive Study

Beauty Form

After competitive user testing, our customers told us that skin type, shade, and age range are helpful information for narrowing down the usefulness of a product based on similarities. Inspired by Fenty Beauty, we were able to narrow skin tone down to 4 categories that all users were able to self-identify with during rapid testing.

See Beauty Competitive Study


Shoes & Boots Form

After competitive user testing, our customers told us that overall fit, comfort, width, and calf width are helpful information for narrowing down the usefulness of a product based on similarities. We used similar design patterns with the 5-point-scale radio buttons, and I collaborated with UX content on copy and rapid testing.

See Shoes Competitive Study

Key Learnings

1. Articulate Form Labels Matter Most

When we rapid tested the optional forms, we saw a pattern of users confused by the label "Review Headline", so we changed it to "Add a Title". Most users did not read the helper text, and after we changed the label, users understood the field much better.

2. Size & Layout Consistency Affect Readability

One problem we noticed during rapid testing of the optional questions for the beauty form was that people skipped the skin tone and eye color portions. Because those buttons were much smaller than the radio buttons, most users told us they didn't even notice them.

3. Mental Models are Effective

For the sliders with a 7-point scale, we tested a 3-point-scale version of the radio buttons, which changed the layout. During rapid testing, users tried to drag the central button, expecting slider functionality. Once we updated it to radio buttons, users completed the task successfully.

4. Collaboration = More Ideas

The design for the skin tone portion emerged from a series of iterations with our product manager and brand designer. From gradients to icons of a face and eyes, we reached a final UI that users readily understood in rapid testing.

Additional Credits


Year: 2019

Product Manager: Audrey Caldwell

Lab Manager: Hillel Familant

UX Researcher: Gargi Godbole

UX Content: Tracy Reppert

Creative: Mark Parra

Developer: Rohit Maddula

Developer: Lily Nguyen