


AI-powered dental practice management system
SaaS product providing intelligence for modern dentistry
My role
Founding product designer
Team
Product designer
Software engineer
Machine learning engineer
Dentist
Dental office administrator
Business development leader
Introduction
Codentist is a startup founded by experienced dentists and ML engineers, built on seven years of clinical research to develop AI models tailored for dental practice scenarios. The SaaS platform combines intelligent diagnostics with workflow automation to transform scheduling, exams, treatment planning, insurance claims, and patient engagement.
My contribution
As the sole designer, I led the 0–1 product design process across desktop and mobile platforms, responsible for feature planning, user experience research, and design execution. I collaborated closely with engineers and clinical experts to ensure accuracy, and also supported marketing and fundraising efforts to drive company growth.
Problem
In over 200,000 U.S. dental offices, doctors and staff remain trapped in outdated, fragmented, manual workflows that drag down productivity and make the entire treatment journey slow and frustrating for both providers and patients.

Staff shortage
Nearly 1 in 3 dental practices report understaffing. Manual workflows force offices to do more with fewer hands.

High operation cost
Inefficient processes further inflate labor and administrative expenses, eating into already thin margins.

Low patient satisfaction
25–30% of dental patients report dissatisfaction, most often due to long waits and slow treatment processes.
Current impact
$1.2M
Raised in seed round
50%
Reduction in user clicks
90%
Of users and investors expressed strongly positive views
45%
Projected reduction in average task time compared to previous platforms
Research
Mapping End-to-End, Multi-User Dental Workflows and Discovering Gaps
I conducted 10+ interviews with dentists, front-desk staff, and practice managers. My goal was not simply to transfer pain points into a “happy flow,” but to uncover where AI could accelerate clinic operations and drive tangible business value for future clients. By looking beyond usability to metrics like revenue impact, efficiency gains, and patient retention, I focused on designing workflows that practices would be motivated to adopt.

Define
AI agent in workflow: How can AI solve the problems?
I conducted an in-depth analysis of AI’s capabilities in intelligence and efficiency, working closely with machine learning engineers to understand its potential and boundaries, and mapped how these strengths could be embedded into the workflow design.

Intelligence
Smarter decisions powered by computer vision for evidence-backed perception, LLM reasoning for contextual understanding, and RL for adaptive optimization.
- Evidence-backed, continuously learning radiograph labeling improves diagnostic accuracy and trust.
- Adaptive treatment plan generation based on patient profile and clinician habits increases patient acceptance and shared decision-making.

Efficiency
Task automation powered by RAG and NLP for rapid information retrieval, speech-to-text for seamless documentation, and robotic workflow execution via APIs.
- Simplifies and automates repetitive manual tasks such as scheduling, communication, and note-taking.
- Reduces administrative workload and lowers labor costs.
- Reduces error-prone processes like insurance claim submission.
- Accelerates cash flow and improves operational reliability.
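As a rough illustration of the capability mapping above, the workflow could be pictured as a router from tasks to AI capabilities. Task and capability names here are illustrative assumptions, not the product's actual architecture.

```python
# Hypothetical sketch: route workflow tasks to the AI capabilities
# described above. Names are illustrative assumptions.
CAPABILITIES = {
    "radiograph_labeling": "computer_vision",         # evidence-backed perception
    "treatment_planning":  "llm_reasoning",           # contextual understanding
    "slot_optimization":   "reinforcement_learning",  # adaptive optimization
    "claim_filling":       "rag_retrieval",           # rapid information retrieval
    "charting_dictation":  "speech_to_text",          # seamless documentation
    "appointment_booking": "workflow_api",            # robotic execution via APIs
}

def route_task(task: str) -> str:
    """Return the AI capability responsible for a given workflow task."""
    # Anything unmapped stays a human task.
    return CAPABILITIES.get(task, "manual")

print(route_task("radiograph_labeling"))  # -> computer_vision
print(route_task("payroll"))              # -> manual
```

The fallback to `"manual"` mirrors the principle, stated later in the case study, that AI augments rather than replaces the human workflow.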

Design
Improve Practice Management Efficiency
Voice-Driven Charting
- AI-powered speech recognition captures doctors’ findings in real time.
- Optional manual inputs allow users to correct or add details.
- Human-in-the-loop confirmation ensures outputs are reliable.
- Visual cues clearly show AI is listening.
💡 Problem addressed
Filling charts manually during patient conversations, based on observation and imaging, is time-consuming.
🎯 Goal
Reduce average charting time per patient by ~30%.
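As a minimal sketch of the voice-driven charting loop described above, the flow below parses a transcript into draft chart entries and commits only what the clinician confirms. The transcript format, parser, and data shape are simplified assumptions, not the production pipeline.

```python
from dataclasses import dataclass

@dataclass
class ChartEntry:
    tooth: str
    finding: str
    source: str = "ai"        # distinguishes AI output from manual input
    confirmed: bool = False   # human-in-the-loop flag

def parse_transcript(transcript: str) -> list:
    """Naive stand-in for speech-to-text parsing.
    Expects lines like 'tooth 14: occlusal caries' (assumed format)."""
    entries = []
    for line in transcript.strip().splitlines():
        tooth, finding = line.split(":", 1)
        entries.append(ChartEntry(tooth.strip(), finding.strip()))
    return entries

def confirm(entries, approved_indices):
    """Only entries the clinician approves are committed to the chart."""
    for i in approved_indices:
        entries[i].confirmed = True
    return [e for e in entries if e.confirmed]

draft = parse_transcript("tooth 14: occlusal caries\ntooth 30: existing crown")
chart = confirm(draft, [0])  # clinician approves only the first finding
```

Keeping `confirmed` as an explicit flag means unapproved AI output never reaches the chart, which is the trust mechanism the feature bullets describe.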
AI Radiograph Analysis
- AI-powered analysis generates evidence-backed annotations.
- Color-coded overlays provide quick visual recognition of findings.
- Confidence indicators help users assess reliability and reduce errors.
- One-click import transfers annotations directly into the dental chart.
💡 Problem addressed
Dentists must examine images, identify conditions, and then re-enter findings into the chart, leading to missed details or redundant work.
🎯 Goal
Reduce radiograph review and charting time by ~30% while maintaining detection accuracy.
Contextual AI note taking
- Available at any step in the workflow.
- AI enrichment combines tracked user actions and transcripts with notes, generating output in clinic-structured templates.
- Auto-tagging links notes with related images, documents, or procedures.
💡 Problem addressed
Progress notes are essential for clinical documentation and insurance, but currently require staff to spend 20–30 extra minutes per appointment on manual writing.
🎯 Goal
Reduce the manual effort required to produce accurate notes while ensuring they remain reliable for clinical and financial use.
Treatment Plan Generation
- AI identifies patient financial preferences and historical records, then generates tiered plan options.
- Seamless sharing with the patient, including rationales, enables quick review and consent, streamlining communication.
💡 Problem addressed
Treatment plan acceptance often requires lengthy back-and-forth conversations, largely driven by patients’ cost concerns.
🎯 Goal
Increase acceptance rates by 20% and shorten treatment plan communication time by ~30–40%.
Automatic scheduling
- Real-time capture of staff–patient conversations lets AI-assisted scheduling quickly add or adjust appointments.
- Automated next-step indicators reduce reliance on manual checks and follow-ups.
💡 Problem addressed
A major concern for clinics is the high rate of patient no-shows and last-minute cancellations, which directly lowers productivity and leaves chair-time unused.
🎯 Goal
Minimize manual scheduling effort to improve clinic efficiency and utilization.
Claim generator
- AI compiles and uploads all required claim documents automatically.
- Error reduction through structured data extraction and validation.
- Citation markers show where data comes from, ensuring transparency.
💡 Problem addressed
Insurance claim submission is among the most time-consuming and error-prone steps. Rejected claims can interrupt cash flow and often take 2–3 weeks to resolve.
🎯 Goal
Streamline claim preparation and decrease rejection rates.
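The claim generator's extract-validate-cite idea might look like the sketch below: required fields are compiled from source documents, missing values are flagged for human review, and every filled value carries a citation marker. Field names and document origins are hypothetical.

```python
# Hedged sketch of claim generation: compile required fields, validate
# them, and record where each value came from. Field names are assumptions.
REQUIRED_FIELDS = ["patient_name", "procedure_code", "date_of_service"]

def build_claim(sources: dict) -> dict:
    """sources maps field -> (value, origin_document)."""
    claim, errors = {}, []
    for f in REQUIRED_FIELDS:
        if f not in sources or not sources[f][0]:
            errors.append(f"missing required field: {f}")
            continue
        value, origin = sources[f]
        claim[f] = {"value": value, "citation": origin}  # traceable source
    # Output is always a draft: a human must confirm before submission.
    return {"claim": claim, "errors": errors, "status": "draft"}

result = build_claim({
    "patient_name": ("Jane Doe", "intake_form.pdf"),
    "procedure_code": ("D2740", "treatment_plan"),
})
# date_of_service is absent, so it is flagged rather than guessed
```

Flagging gaps instead of guessing, and tagging the whole output as a draft, matches the transparency and human-in-the-loop patterns described in the next section.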

AI design patterns
Across all patterns, AI is integrated into the workflow using a consistent visual language — unified colors, tags, and gradient highlights distinguish AI features from manual actions. This ensures AI feels seamlessly embedded in the experience, clearly visible yet non-intrusive, reinforcing trust while maintaining workflow continuity.
Copilot action
Clearly indicate user actions and AI actions, ensuring AI supports the workflow as a copilot without disrupting the established human workflow.
Voice Trigger
- Voice activation modal appears when triggered.
- Motion effects signal that the agent is actively listening.
- Clear visual cue assures users AI is working.
- Flow interruption is intentional, so users pause briefly and know the system is waiting for their next command.
AI action side panel
- A collapsible panel runs alongside the main flow, keeping AI actions transparent while letting users stay focused and in control.
- Responsive updates keep content aligned with the agent’s actions.
- AI interprets speech into tasks and may contextually trigger secondary actions (e.g., note-taking).
- Chat remains available for quick information access.
Context-Aware Quick Access
Quick access to AI that is aware of the user’s current context, offering smart shortcuts or recommendations directly within manual flows.
Easy Quick buttons
- Quick buttons in the sidebar trigger the chat window or voice input anytime.
- Context-aware AI reads the current page and user action to tailor responses.
- Smart recommendations surface relevant shortcuts, reducing effort and clicks.
Shortcuts
- Key AI features embedded as shortcut buttons directly on relevant sections.
- Reduces learning curve by surfacing AI in context rather than hiding it in menus.
- Minimizes page switching, streamlining the overall workflow.
Embedded Smart Features
- AI anticipates needs and suggests context-relevant actions (e.g., available time slots for scheduling, smart tagging and image sorting).
- One-click execution helps users complete tasks instantly without manual searching and reduces decision-making time.
Transparency & Human in the Loop
Ensure AI results are traceable and editable, with confidence levels, citations, and tagging, always requiring human confirmation to build trust.
AI Confidence Level
- AI annotations are clearly marked with distinct icons and colors to differentiate them from human input.
- Confidence levels are displayed so users can assess reliability and filter results as needed.
- Supports decision-making by giving users control over which AI outputs to trust.
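The confidence-level pattern can be sketched as a simple display filter: human input is always shown, while AI output below a user-chosen threshold is hidden. The annotation shape and threshold value are assumptions for illustration.

```python
# Minimal sketch of confidence-based filtering. The annotation fields
# and the 0.7 default threshold are illustrative assumptions.
annotations = [
    {"finding": "caries, tooth 19",  "confidence": 0.93, "source": "ai"},
    {"finding": "possible fracture", "confidence": 0.41, "source": "ai"},
    {"finding": "crown noted",       "confidence": 1.00, "source": "human"},
]

def visible(annots, min_confidence=0.7):
    """Human input is always shown; AI output is filtered by confidence."""
    return [a for a in annots
            if a["source"] == "human" or a["confidence"] >= min_confidence]

shown = visible(annotations)  # hides the low-confidence AI finding
```

Exempting human input from the filter preserves the pattern's core distinction between AI annotations and human annotations.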
Citation
- Citations are displayed when AI retrieves or uses external content (e.g., for auto-filling a claim).
- Outputs are tagged as drafts, signaling that results are provisional and need user confirmation.
- Keeps users in control, ensuring they can verify and adjust AI-generated content.
Human in the loop confirmation
- All action-oriented AI outputs require user confirmation before final execution, ensuring users validate AI-generated results for accuracy and reliability.
- Prevents errors by keeping final control in human hands.
Proactive and predictive AI
Beyond passive responses — AI works in the background to anticipate needs, push intelligent notifications, and suggest tasks.
Seamless Next-Step Guidance
- AI identifies patient-specific next steps (e.g., missing forms, follow-up needs, unpaid balances).
- Embedded recommendations and notifications, in the form of icons, appear directly on relevant pages, such as the scheduling view.
- Supports decision-making by surfacing timely actions without extra clicks.
AI Task Center
- AI continuously scans workflows to detect pending or missed tasks.
- Push notifications alert users of their to-do items in real time.
- Centralized task center collects these items so nothing gets overlooked.
Design
Boost patient engagement and revenue
Interactive 3D Patient Report
- Interactive 3D report in the patient portal visualizes treatments, diagnostics, and progress over time.
- Embedded education within reports reinforces care instructions and treatment value.
💡 Problem addressed
Traditionally, patients have low visibility into their treatment progress and outcomes, resulting in low revisit rates and weak loyalty.
🎯 Goal
Increase revisit rates and patient retention.
Customizable & Role-Based KPI Dashboard
- Surfaces the most relevant metrics for each role by default, with full customization available.
- AI conversation allows quick queries to adjust the dashboard and find specific insights.
- AI-powered data interpretation and operation-optimization suggestions.
💡 Problem addressed
Office managers need to monitor office performance regularly, but struggle to extract the right insights from overwhelming datasets in their current Excel sheets.
🎯 Goal
Reduce time spent synthesizing reports and searching for data.
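The role-based default idea above could be sketched as a default widget set per role, layered with user customization. The role names and widget names are illustrative assumptions, not the shipped configuration.

```python
# Hypothetical sketch of role-based dashboard defaults plus user
# customization. Roles and widgets are illustrative assumptions.
DEFAULT_WIDGETS = {
    "office_manager": ["revenue", "chair_utilization", "claim_status"],
    "front_desk":     ["today_schedule", "pending_confirmations"],
    "dentist":        ["today_schedule", "pending_notes"],
}

def dashboard_for(role, custom=None):
    """Start from the role's defaults, then append user-chosen widgets."""
    widgets = list(DEFAULT_WIDGETS.get(role, []))
    for w in (custom or []):
        if w not in widgets:
            widgets.append(w)
    return widgets

view = dashboard_for("front_desk", custom=["unpaid_balances"])
```

Starting from sensible defaults and letting users extend them reflects the "relevant by default, customizable on demand" behavior the feature bullets describe.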
From Insights to Impact: AI-powered, collaborative process
Design process
How did I understand context and user?

Know patient and doctor experience
By visiting a dental office, I found that scheduling often required multiple phone calls, and the assistant had to record our conversation and enter it into the system — a process that was time-consuming and distracting.

User research - Mapping user journey
Through interviews with dentists, front-desk staff, and managers, I mapped the user journey, validated my hypothesis, and identified that manual paperwork and guesswork occur not only in visit notes but also in diagnosis, report writing, and claim submission.

Competitor research
We analyzed PMS systems and AI radiographic tools to spot strengths, gaps, and UX issues. Most PMS platforms had outdated designs and only basic chatbots. This revealed a clear market need for an end-to-end workflow with strong voice, visual, and text AI.

Feature prioritization
We brainstormed AI opportunities and ranked them with a cost–impact matrix. Top priority went to core workflow features like charting and radiograph analysis, followed by business-value features like AI treatment plans. Lower-priority items were chatbot support and scheduling automation.
Design process
Design AI features, and design with AI

Collaborate with MLE and define AI framework
I explored the AI tools and models needed for our design vision. We adopted an orchestrator agent that calls different tools and analysis models while learning from new data, ensuring the system stays reusable and scalable for future iterations.
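The orchestrator-agent idea could be sketched as a single coordinator that dispatches interpreted user intents to registered tools. The class shape, tool names, and routing rule here are assumptions for illustration, not the production pipeline.

```python
# Illustrative sketch of an orchestrator agent dispatching intents to
# specialized tools. Names and routing are assumptions.
class Orchestrator:
    def __init__(self):
        self.tools = {}  # intent name -> callable tool

    def register(self, name, tool):
        self.tools[name] = tool

    def handle(self, intent, payload):
        """Route an interpreted user intent to the matching tool."""
        tool = self.tools.get(intent)
        if tool is None:
            return {"status": "unsupported", "intent": intent}
        return {"status": "ok", "result": tool(payload)}

agent = Orchestrator()
agent.register("schedule", lambda p: f"booked {p['patient']} at {p['slot']}")
agent.register("note", lambda p: f"note saved: {p['text']}")

out = agent.handle("schedule", {"patient": "J. Doe", "slot": "10:30"})
```

Because tools are registered rather than hard-coded, new analysis models can be added without changing the coordinator, which is the reusability and scalability property described above.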

Define layout, navigation and design system
I built a unified design system for modals, buttons, typography, and variables to ensure visual consistency, allowing both the patient app and the clinic desktop system to share a coherent look and feel.

Vibe coding
After confirming requirements, I used AI-powered tools to quickly build the first interactive prototypes. This “vibe coding” stage gave us clickable flows early, speeding iteration and aligning the team on a shared product vision.

Quick user testing
I ran rapid tests with staff and doctors, using the prototype to gather feedback on functionality and usability. The interactive flows made requirements tangible, helping participants understand workflows and give concrete suggestions.

Refine design in Figma
Using the design system and user testing insights, I refined the vibe coding prototypes in Figma into high-fidelity designs, then iterated through regular critiques and testing with the team and users.

Write design specification
I consolidated requirements into detailed design specs covering interactions, logic, and edge cases, ensuring alignment with engineers and clear documentation for future iterations.
Design process
Accelerate launch, achieve market breakthrough

Support engineer to develop with AI
I collaborated with engineers using AI-driven coding to accelerate MVP delivery. We ensured auto-layout and the design system integrated seamlessly into the MCP server’s agent pipeline.

Supporting BD in investor presentation
Beyond product design, I supported BD by creating demo recordings and investor visuals, reframing the product narrative to highlight business value and communicate its potential to investors.
Keep Iterating Through Feedback
Iteration
Improve user experience in the dental chart: easier to record and consume information

Use of icons for readability & accessibility: Boost efficiency
To reduce reliance on dense text and codes, icons were introduced to make the chart more scannable.

The chart was too text-heavy, making it hard to quickly interpret.
Unified view for treatment plan & current condition
Previously, users had to switch pages to document treatments and conditions. The new flow combines both in one place, with AI shortcuts.

“Switching between pages to add conditions and plans breaks my flow.”
Interactive timeline for past records
Replaced date dropdowns with an interactive timeline, making historical data easier to navigate and condition changes over time visible to doctors.

More realistic 3D rendering of teeth
Enhanced visuals with realistic 3D-rendered teeth to support clearer communication.

Iteration
Using design to make treatment plans more acceptable and streamline communication.

Include patient information
Integrated patient info into the planning view with customized patient tags, allowing users to create more suitable plans.

“When making a plan, I need to see the patient’s background — especially financial details.”
Add recent diagnosis and imaging
Displayed recent diagnostic results and imaging within the planning interface, reducing unnecessary page switching.

“It’s very frequent to click back to another page just to check the latest diagnosis or images.”
Emphasize price: key decision factor
Highlighted pricing more clearly in the UI, ensuring patients and staff can focus on financial transparency.

“Cost is the main factor in whether a patient accepts a plan; I hope to improve the treatment plan acceptance rate.”
Preview expected treatment outcome
Added a bonus feature to generate a visual preview of expected outcomes, making plans easier for patients to accept.

“Patients often lack understanding of the treatment plan; I hope discussing it with them can be more engaging.”
Iteration
Image Center: Smarter Viewing, Richer Insights

Re-design right-side panel with foldable cards
Designed multiple card widgets, including an AI toggle, AI findings, metadata, and note-taking, offering richer detail and a more intuitive layout.

“The AI toggle feels too simplistic — I want to see details related to this image.”
Optimize history image browser
Improved history image switching with a folder-style browser with filters, allowing users to quickly switch images within the viewer instead of going back to the image center to select another one.

“Switching between past images is slow — I need a faster way to review history.”
Improved image comparison
Optimized the comparison mode to allow cross-type image viewing with more screen space, supporting real diagnostic workflows.

“I often look at intraoral photos alongside FMX, comparing different images helps me make better decisions.”
Accessible and emphasized buttons
Added text labels to buttons for clarity and emphasized the CTA button for Save & Export to Dental Chart.

“I’m not always sure what some of these actions mean — the icons alone are confusing.”
Iteration
Enrich role-tailored dashboards for operation optimization

Role-based default dashboard
Deeply understood the workflows of different roles and designed role-based dashboards that surface the most relevant widgets by default, aligning with each user’s workflow.

“I want to open the dashboard and instantly see what I need to do now: show me the most important things first.”
Customizable and widget-based dashboard
Enabled customization, allowing users to arrange widgets so managers and staff can tailor their dashboards to the most meaningful data. I keep enriching the widget library to cover everything users need.

“I’d like more widgets and the freedom to choose what goes on my dashboard.”

Takeaway
Reflection on AI design
Human workflow comes first
AI should never be added for its own sake. Before introducing AI, it’s essential to understand the human workflow, identify pain points, and then integrate AI where it can bring value. We didn’t reinvent clinical routines; instead, we used AI to solve problems and optimize existing flows.
Know AI’s capabilities
In this project, multiple AI models — voice recognition, computer vision, and reinforcement learning — interacted across workflows. Designers need to understand how these systems operate and complement each other, so we can translate technical capabilities into technically feasible features.
Respect AI’s limitations
Transparency and human autonomy are critical — especially in healthcare — where AI must support, not replace, clinical judgment. AI must always demonstrate credibility, from citation markers to confidence levels, while giving users the power to review, modify, and trace results.
My experience in using AI to accelerate design & development
In this project, I explored vibe coding and collaborated with engineers who used AI-assisted coding. This enabled a new workflow of quick prototyping, fast testing, and iterative validation. My process evolved into: vibe coding → rapid testing with team → refining with design system → finalizing in Figma for seamless handoff via MCP agent. This accelerated loop is especially vital in a startup environment.