Product Design Internship 2023
Launching A Career Assessment Tool For Clinical Research Coordinators
Emory University and Emory Healthcare encountered a surge in unsuitable applications for CRC roles due to unclear role requirements. In response, we designed and launched Talent Trace, a solution guiding candidates to identify CRC positions that align with their skills and qualifications.
TIMELINE
Dec 2023 - Current
ROLE
Strategy, Research, Design
COLLABORATORS
1 Project Manager
5 Developers
TOOLS
Figma, Figjam, SurveyJS
Problem and Solution
Over-application for mismatched CRC levels
Principal Investigators (PIs) at Emory University face a unique challenge: the specific needs for Clinical Research Coordinator (CRC) positions vary greatly across research projects. This variability often confuses applicants, leading many to apply for multiple CRC levels indiscriminately and resulting in an excessive number of applications. Addressing this issue is key to simplifying the recruitment process.
Image: Applications & Positions for CRC I, II, III and IV at Emory University in 2022
How might we guide applicants to apply for the right CRC level with confidence and ease?
Talent Trace simplifies the CRC job application process. Through a series of tailored questions, it helps users identify the most appropriate CRC positions for their skills and aspirations. This system also offers resources for higher-level opportunities.
Product Alignment with Clients
How to screen candidates for different CRC levels?
I developed a flow chart to streamline candidate screening for CRC levels I to IV, leveraging key metrics like education, work/research experience, and certifications (provided by clients). This visual tool not only enhanced communication with clients but also effectively aligned their expectations with the logic underpinning our product development.
Note: Logical details are blurred for reasons of confidentiality.
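To make the screening idea concrete, here is a minimal, hypothetical sketch of how such level logic could be expressed in code. The level names are real, but the fields, thresholds, and decision path below are illustrative placeholders only, since the actual criteria are confidential.

```ts
// Hypothetical sketch only — the real screening criteria are confidential.
// Field names and thresholds are illustrative placeholders.
type Degree = "highSchool" | "bachelors" | "masters" | "doctorate";

interface CandidateProfile {
  highestDegree: Degree;
  yearsResearchExperience: number;
  hasCrcCertification: boolean; // e.g., a clinical research certification
}

type CrcLevel = "CRC I" | "CRC II" | "CRC III" | "CRC IV";

// Walks the candidate down a simple decision path, mirroring the idea of the
// flow chart (education → experience → certification), not its actual rules.
function recommendCrcLevel(p: CandidateProfile): CrcLevel {
  if (p.highestDegree === "highSchool") return "CRC I";
  if (p.yearsResearchExperience < 2) return "CRC I";
  if (p.yearsResearchExperience < 4) return p.hasCrcCertification ? "CRC III" : "CRC II";
  return p.hasCrcCertification ? "CRC IV" : "CRC III";
}
```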
How to navigate users to the assessment tool?
Talent Trace, an MVP product independent from Emory University's iCIMS job application system, empowers users to accurately identify suitable CRC levels for application. Future integration plans include embedding Talent Trace into the job application process, where it will automatically screen applicants and provide tailored recommendations.
MVP Prototype & Pilot Test
Goals of the pilot test
I created rapid prototypes of the MVP product to test with users. We recruited and tested with 5 Emory University CRC I professionals. The test aimed to refine design elements for user experience, determine motivations for assessment participation, and ensure clarity and trust in the results.
1. Decide the feeling to convey through design
2. Identify motivators for assessment participation
3. Ensure trustworthiness of results
4. Ensure clarity and usability in question formats
Major findings and insights
The user test surfaced only minor usability issues, since the product follows a familiar format, but it revealed several important insights.
Balancing efficiency with engagement
While users prioritize quick assessment completion, they also showed interest in features that offer more than just efficiency. Adding elements of enjoyment can enhance the overall experience in their job-hunting journey.
Optimizing selection processes
Users faced challenges with the overwhelming options in dropdown lists, suggesting a need for a more streamlined selection process. This will ensure ease of use and accuracy in their responses.
Transparent results and justifications
A critical finding was the need for transparency in the assessment outcomes. Users wanted to know why they were deemed qualified or not, which is essential for building trust in the product and preventing unqualified applications.
Layout Feedback - Efficient Or Fun
Dynamic User Flow
✨ Design Explorations & Iterations ✨
1. Balancing efficiency with engagement
A critical aspect of the design involved determining the optimal arrangement of sections and questions. I explored and tested 3 layouts and interactions, tailoring them to user needs ranging from efficiency to ease of use.
Vertical layout
Maximum efficiency
Matches users’ expectations of job application product formats
× May appear lengthier despite its efficiency
× Increases the risk of missing questions
The vertically scrollable form with a left-side navigation bar. All questions in a section are listed on one page.
Horizontal cards
Enhances concentration on one item at a time
Works well with the multiple sections structure of the survey
× Requires more clicks to complete
× Users may not be accustomed to this mode and interaction
Questions in each section are displayed on a horizontally scrollable page, navigable via left and right arrows. Users proceed to the next section by clicking a 'next' button.
Vertical layout with screening questions
Offers an efficient yet easy experience
Enables users to answer screening questions first - then delve deeper or skip
Resonates with general online form expectations, though differs from typical job application layouts
Begins with screening questions for each section, focusing users on these before transitioning to a standard vertical scroll for subsequent questions.
2. Adapting components with numbers of options
To address the challenge of selecting from option lists of varying lengths, I used different components (checkbox/radio buttons, dropdowns, and listboxes) depending on the number of options, as sketched below the list:

1. < 5 options: radio buttons or checkboxes
2. 5 - 15 options: dropdown
3. > 15 options: list box (allow users to type to search for options)
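A minimal sketch of this rule. The component names are generic placeholders rather than SurveyJS question-type identifiers, and the helper function is an assumption for illustration:

```ts
// Maps the number of answer options to a form component, per the rule above.
// Component names are generic; the shipped survey was built with SurveyJS,
// whose question types are named differently.
type OptionComponent = "radio-or-checkbox" | "dropdown" | "searchable-listbox";

function pickComponent(optionCount: number): OptionComponent {
  if (optionCount < 5) return "radio-or-checkbox"; // few options: show all inline
  if (optionCount <= 15) return "dropdown";        // medium list: save space
  return "searchable-listbox";                     // long list: type-to-search
}
```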
Dropdown
Ideal for fewer options - saves space
× Challenging to browse, add, or remove items in longer lists.
× Prone to being closed unintentionally.
Best for 5 - 15 options, offering a space-saving solution. However, dropdowns become less user-friendly for lists exceeding 15 items due to navigation and selection challenges.
Listbox
Users don't need to click on anything to reveal options inside
Allows users to see multiple options at once
The search function allows users to find options quickly in a long list
Users can click on the items enclosed in the container box to select one or many from the list - without a need to click anything to reveal options.
Users don't need to select "Other" to add options that aren't in the list. They can simply type to add it.
Quick add feature for adding options that are not in the list.
3. Facilitating navigation in lengthy forms
The education section, the longest part of the form, allows users to add up to five degrees, each with 5-7 questions. User testing revealed this section as notably challenging to complete, with a higher risk of overlooking required questions.
Vertical layout
Streamlined if all questions are answered correctly
× Easy to miss questions
× Difficult to track which questions correspond to which degree
Consolidates all degree-related questions on one scrollable page, ideal for simplicity but potentially overwhelming for multiple degrees.
Paginated layout
Aligns questions with their respective degrees
Reduces length of individual pages
× Requires extra clicks to navigate through pages (degrees)
Distributes questions across multiple pages, segmented by degree. This approach enhances organization but increases navigation effort.
Vertical layout with navigation
Retains the streamlined nature of vertical layout
Questions clearly linked to specific degrees
Indicates the finished sections
Facilitates navigation through degrees
× More intricate to implement
Consolidates questions into a single scrollable page, complemented by a left-side navigation bar. Users can click links to jump to specific degree sections, with animations providing intuitive layout cues.
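The jump-to-section behavior can be illustrated with a small sketch. The element ids, data attribute, and markup below are assumptions for illustration, not the shipped implementation:

```ts
// Minimal sketch of the jump-to-degree navigation, assuming each degree
// section is rendered with an id like "degree-1", "degree-2", etc.
function scrollToDegree(index: number): void {
  const section = document.getElementById(`degree-${index}`);
  if (!section) return;
  // Smooth scrolling provides the animated cue that helps users keep
  // track of where they are in the long form.
  section.scrollIntoView({ behavior: "smooth", block: "start" });
}

// Wire up the left-side navigation links (hypothetical data attribute).
document.querySelectorAll<HTMLAnchorElement>("[data-degree-link]").forEach((link, i) => {
  link.addEventListener("click", (event) => {
    event.preventDefault();
    scrollToDegree(i + 1);
  });
});
```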
4. Clarifying result justifications
User feedback indicated that assessment outcomes (qualified or not for CRC positions) lacked persuasiveness. The key issue was the absence of clear explanations for these results. To improve user trust and understanding, it’s essential to incorporate detailed reasons behind each outcome, directly addressing users' queries about their assessment performance.
Result - Qualified or not
Clear communication of assessment results
× Users are unclear on reasons for disqualification
× Too wordy
Simply indicated 'qualified' or 'not qualified' without further details
Explanatory result
Explains results clearly, fostering trust in the result and the product
Offers actionable tips for attaining higher CRC levels
Directs users to appropriate CRC positions or advancement resources - achieving business value
Now elucidates why users receive their specific results. Additionally, it offers actionable tips and resources for advancing to higher CRC levels.
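One way to think about the explanatory result is as a structured object where every outcome carries its own justification. The shape and field names below are illustrative assumptions, not the product's actual data model:

```ts
// Illustrative shape for an explanatory result — field names are assumptions.
interface AssessmentResult {
  recommendedLevel: "CRC I" | "CRC II" | "CRC III" | "CRC IV" | null;
  qualified: boolean;
  reasons: string[];            // ties the outcome back to specific criteria
  nextSteps: string[];          // actionable tips for reaching a higher level
  relatedPositionsUrl?: string; // link to matching open CRC positions
}

// Illustrative values only — not real screening criteria.
const example: AssessmentResult = {
  recommendedLevel: "CRC II",
  qualified: true,
  reasons: [
    "Bachelor's degree meets the education requirement",
    "Research experience meets the CRC II threshold",
  ],
  nextSteps: ["Obtain a clinical research certification to progress toward CRC III"],
};
```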
Midpoint Result
Clickable Prototypes
I developed clickable prototypes reflecting the feedback from user tests.
Hand-offs to Development
Responsive design implementation
Because many candidates apply for jobs from their phones, the prototype is responsive across screen sizes, from desktop to mobile. I established breakpoints for each and provided the development team with a responsive grid system.
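As a minimal sketch of how such breakpoints might be handed off as design tokens (the pixel values and token names below are hypothetical, not the project's actual values):

```ts
// Hypothetical breakpoint tokens shared alongside the responsive grid system.
const breakpoints = {
  mobile: 0,     // single-column layout, full-width form fields
  tablet: 768,   // two-column grid where space allows
  desktop: 1200, // full grid with the left-side navigation visible
} as const;

type Breakpoint = keyof typeof breakpoints;
```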
Handoff showcase
A showcase of how I handed off the design specs to the engineering team.
Learnings and Reflections
Micro-interactions matter in UX design
Even in simple products like an assessment tool, the details matter. Standard formats often overlook specific user needs. Thoughtful design decisions — from navigation placement to required field indicators — enhance user experience. Micro-interactions play a crucial role in communicating system status and guiding users effectively.
Be transparent with your users
Transparency fosters trust and respect, particularly in AI-driven products. Clearly explaining product logic and algorithms is crucial. While business constraints may limit full disclosure, translating complex logic into user-friendly content is essential for clear communication and understanding.