Oracle DX4C Care X

CX SaaS for the Telecommunications Industry


June 2019 – Present
Lead Product Designer

Designers (4), Product Managers (3), Developers (30+), Content Strategists (2)

My contributions
Research, User Interviews, Product Strategy, Motion Prototyping, IxD, VxD, Information Architecture
Design a modern customer experience solution tailored for the telecommunications industry to increase revenue and customer satisfaction.
We got pushback from Product Managers and Developers on conducting user interviews, since we were on a tight deadline.


The telecom industry ranks lowest in customer satisfaction compared to other industries, with call center satisfaction being its lowest-scoring category.

Service agents spend a lot of time on manual processes and have limited time to build rapport with their customers and up-sell products.



There’s still a lot of work to be done; however, a good indicator of success is that the first release and demos have helped Oracle attract some key new customers.
We worked closely with the Human Capital Management team, our Core Design System team, and the central AI team. Our collaboration, among others, was featured during Oracle’s San Francisco Design Week User Experience Award 2021 for bringing state-of-the-art, consumer-grade user experiences to the sophisticated enterprise industry.

Latest iteration of the Foldout View in motion.

How did we get here?

We gathered requirements from Product Management as a starting point, and did UX research in parallel to understand our users’ needs.

Research insights

We studied the current market situation for the telecom industry compared to other industries, performed a competitive analysis to find business opportunities, and conducted user interviews and workshops to iterate on our designs.


Video streaming services, such as Netflix, have set the bar higher on customer satisfaction for traditional telecom companies.

Telecom customer satisfaction is among the worst compared to other industries, with call center satisfaction being the lowest-scoring category across all types of service:

Wireless service

Internet service

TV service


Competitive analysis

It was a good opportunity to differentiate ourselves from the competition by simplifying the service agent’s interaction with enterprise software that has many features and can easily become very complex.



User insights

(Gathered from 20+ hours of user interviews and workshops)

Surprisingly, telecom service agents are not measured on customer satisfaction; instead, they’re measured by sales quota.

Spending too much time fixing an issue reduces the opportunity to build rapport with the customer, gain their trust, and make a sale.

Through these workshops, we gathered key insights and defined the shape of the data.
In summary, agents spend a lot of time finding an issue with a service, which impacts the customer experience and reduces the opportunity to up-sell or cross-sell products. While they’re fixing the issue on one monitor, they’re also creating an offer on a second monitor. In cases where a customer doesn’t want to purchase a new product, agents have a fixed budget to keep customers happy and improve customer retention.

Key metrics from our research:

Number of phone calls an agent receives per day

Average handling time per customer (minutes)

Percentage of adjustments made through up-selling

Time spent engaging with legacy systems

Frequency of billing-related issues

Time to up-sell or cross-sell products: under 7 minutes


Understanding our users

Service Agent

Telecom Industry

Pain points


Limited time to up-sell products.


Measured on monthly sales quota.


Work with manual processes.

User goals

Service agents need to nurture customers’ accounts and services in order to meet their quota.

Service agents need to understand customer’s needs in order to make the best recommendation possible.

Design process

As we gathered key insights from our research, we moved on to creating the navigation and information architecture. This allowed us to build a narrative for our story and define our key screens.

Signature moments

One of our first explorations provides a better visualization of the customer’s billing history. This allows the user to quickly identify the issue with a bill and provide recommendations powered by AI.

This also reduces the user’s friction of having to manually identify the problem by comparing bills across two monitors.
Our first iteration was a motion demo that helped us showcase our product strategy to our leadership. We also got very positive feedback from our user interviews and from Oracle’s sales representatives.

1st iteration of smart recommendations powered by AI.

We collaborated with the central AI team to consider best practices, such as letting users provide feedback so that the AI can make better recommendations in the future.
Our latest iteration evolved to align with our Design System’s vision and design patterns.

Latest iteration of smart recommendations powered by AI.

Moving forward

As we continued making progress, we were asked to incorporate the Sales and Service apps’ design patterns into our product. During that process, we discovered navigation and IA issues.
The main focus of the app was an activity feed that helped drive transactional interactions, but we realized this was not a scalable solution since all the interactions were done inline.

Screenshots taken from the Service product.

Our approach was to take a step back and propose a design that would cover most CX cases. Our solution displayed an overview page that we named Foldout View, where the information was grouped together in pillars to reduce cognitive load, and users could easily navigate between a pillar and a detail page.
Our biggest challenge was to maintain our design intent, which was that the Foldout View should be an overview page and was not intended for heavy-lifting actions. We wanted to stay away from having “everything one click away”.

1st iteration of the Foldout View in motion.

Our latest iteration follows our signature moment’s narrative. An agent views the latest activity in the account, notices excess data usage on a mobile plan, explains to the customer why they were charged more compared to previous months, and makes an offer recommendation to help them avoid being overcharged again.

Latest iteration of the Foldout View.


Foldout View skeleton animation.

The Foldout View was well received by other teams, so I was asked to design the IxD specs to include it in our Design System for teams outside of CX to adopt.

The biggest challenge I faced when working with the Core Design System team was having too many stakeholders giving feedback and finding a middle ground that was a scalable solution for every product at Oracle.

Assisted buying use case

Customers often need to contact a real person in order to fulfill a purchase or solve a problem that unassisted channels could not. Although this raises other concerns regarding consumer-facing products, this product focused on the agent’s perspective, so we had to design for this use case.

1st iteration of an assisted buying flow for a new customer.

Our first approach met the requirements; however, it didn’t align with our vision of designing consumer-grade experiences for the enterprise industry.
We designed a conversational UI with three purposes:

1. Help the user have a fluent conversation with the customer by using a script.

2. Reduce the time it takes the user to make a recommendation based on customer needs.

3. Increase the user’s productivity through a guided process.

Latest iteration of an assisted buying flow for a new customer.

Introducing 3-week design sprints

As a design team, we learned that a transparent workstream with our cross-functional team improved our productivity, so we started breaking requirements into smaller pieces and focused on completing them constantly and more often. This also prevented too much rework, because everyone was on board with our design decisions and any concerns were addressed at an early stage.


What's next?
Continue iterating after testing our MVP with three real-life Oracle customers.
Research is not negotiable. The insights we gathered had a major impact on our design decisions.
What would I do differently?
I should’ve communicated a transparent workstream with the team from the beginning. At first, our design process wasn’t transparent and caused friction with the rest of the cross-functional team.
Although our designs still need to be stress-tested by our users, we’ve had a high acceptance rate from the design community, our leadership, other teams at Oracle, and potential customers.