IBM Planning Analytics
Financial planning for Enterprise and SMB
IBM Planning Analytics, powered by TM1, is a business performance management solution that enables organizations to create tailored planning, budgeting, and forecasting applications. Built on a high-performance in-memory database, it combines collaborative planning capabilities with interactive analytics and comprehensive reporting features.
Company
IBM
Industry
Enterprise B2B
Contribution
UX Design
UX Research
Tools
Figma
Photoshop
Illustrator


Defining the project
Kicking off in May 2023, I joined a cross-functional team as the Lead Designer tasked with reimagining the IBM Planning Analytics trial experience. The existing trial wasn't meeting user needs, especially for first-time users: the product's complexity overwhelmed them with capabilities and left them unsure where to start. The data confirmed the trial had been severely underperforming. Only 3 of 1,769 users converted in the past year (a conversion rate of roughly 0.17%), guided tour completion was essentially zero at 0.003% despite 50% of users starting the tour, and only 14.5% of users (1,730 of 11,894) completed the basic sign-up form.
Our mission was clear: create an intuitive, personalized experience that would showcase the product's capabilities while reducing the learning curve. Our key objectives included:
- Designing a more engaging onboarding process that would capture user interest within the first few minutes of interaction
- Highlighting Planning Analytics' AI integration capabilities to demonstrate how automation could streamline financial planning workflows
- Improving time-to-value for new users by allowing them to experience meaningful results in their first session
- Providing a taste of the product that would appeal to different user types, from financial analysts to executives
- Converting trial users into customers by demonstrating clear value that addresses specific pain points in their planning processes
An added challenge lay in the complexity of the product and the diverse skill levels of potential users. We needed to strike a balance between simplicity and depth to appeal to both novice users and financial planning experts.
Design process
To ensure a structured approach to this complex redesign, we established a four-phase process that would guide our work from concept to completion:
- user definition
- accelerated design sprint
- design, development and research
- launch and iteration
This helped us define a clear roadmap while allowing for the flexibility needed to respond to insights gained along the way. It also helped set expectations with stakeholders about the project timeline and deliverables at each stage.
User definition
Our first phase focused on identifying and understanding our primary target users. Based on in-depth research carried out for this project, Planning Analytics users fall into four persona categories:
Caretaker - Responsible for the success of the solution they proposed, translating business goals into technological solutions. They work with Producers and vendors during initial set-up.
Contributor - Business professionals such as Analysts, Line of Business Managers, and Budget Owners, who enter data or use dashboards to run analyses for Producers and Consumers.
Consumer - Often associated with leadership roles such as Directors and Executives, who translate targets into actionable processes.
Producer - Creates dashboards to collect data from other Producers as well as Contributors, and plans targets using inputs from monitoring and prediction.
This foundational work helped us determine the appropriate starting point for the new product experience and ensured we were designing with real user needs in mind rather than assumptions.






Accelerated design sprint
With our user persona now defined, we followed up with an intensive 5-day design workshop. This collaborative jam brought together team members from product management, development, content strategy, and design.
Each day followed a structured approach:
Our mornings were dedicated to collaborative ideation and exploration, where every team member contributed fresh perspectives.
Afternoons shifted to rapid prototyping and refinement as we transformed our ideas into tangible concepts.
At the end of each day, we presented our explorations to stakeholders for immediate feedback and alignment, ensuring we remained on track with business objectives while pursuing innovative solutions.
Through a wind-tunneling process applied by our cross-functional team during the workshop, we defined a comprehensive user flow showcasing the different ways Shani, our defined primary user, could navigate and utilize the product. One of the main consensus points that emerged from the jam was the importance of incorporating AI to enhance the overall user experience. The team aligned on leveraging AI capabilities to make Shani's workflow more efficient by providing intelligent suggestions, automating repetitive tasks, and surfacing relevant insights based on her planning activities.


The accelerated design jam allowed us to quickly align on user needs and business goals while generating numerous concepts and immediately testing their viability. The approach secured stakeholder buy-in throughout the process rather than at the end, and established a solid foundation for the full design phase that would follow.
By the end of the week, we had consensus on a direction that would gamify the experience through guided tasks, giving users a sense of accomplishment while learning the platform's core capabilities.
Design, development and research
Following the sprint, I developed a series of design explorations centered around gamification principles. The initial concept featured three personalized paths:
- Self-exploration for power users
- Guided exploration highlighting key features for regular users
- Step-by-step walkthroughs for novice users
I started my explorations by creating several low-fidelity wireframes of the possible personalization experience, mapping out how users would be directed to the most appropriate path based on their self-identified expertise level and goals. These explorations focused on making the initial decision point intuitive while setting clear expectations for each path.












Additionally, I developed several homepage experience concepts focused on helping users achieve their specific goals immediately upon entering the trial. These designs prioritized clear entry points to common tasks, surfaced relevant templates, and provided contextual help based on typical user objectives when trying the platform. I also explored using Carbon's novice-to-pro onboarding components to help users navigate each experience easily. These components provided consistent patterns for tooltips, progress indicators, and guided overlays that could adapt to a user's growing familiarity with the product, ensuring assistance was available when needed without impeding more experienced users.
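To make this pattern concrete, here is a minimal sketch of how such a guided-task panel could be wired up. It assumes @carbon/react's ProgressIndicator, ProgressStep, and Button; the step content and the verbose flag are illustrative stand-ins rather than the shipped onboarding components.

```tsx
// Minimal sketch of a guided-task panel in the spirit of Carbon's
// novice-to-pro onboarding patterns. Step content and the `verbose`
// prop are illustrative assumptions, not the shipped components.
import React from 'react';
import { ProgressIndicator, ProgressStep, Button } from '@carbon/react';

interface GuidedStep {
  label: string;
  hint: string; // longer help text shown to novice users
}

const steps: GuidedStep[] = [
  { label: 'Open the sample plan', hint: 'Start from the preloaded FP&A sample data.' },
  { label: 'Adjust the forecast', hint: 'Change a driver and watch the plan update.' },
  { label: 'Share a report', hint: 'Publish a view for stakeholders to review.' },
];

interface GuidedTaskPanelProps {
  currentIndex: number;
  verbose: boolean; // novices see hints under each step, pros see labels only
  onNext: () => void;
}

export function GuidedTaskPanel({ currentIndex, verbose, onNext }: GuidedTaskPanelProps) {
  return (
    <aside aria-label="Guided task">
      <ProgressIndicator currentIndex={currentIndex} spaceEqually>
        {steps.map((step) => (
          <ProgressStep
            key={step.label}
            label={step.label}
            secondaryLabel={verbose ? step.hint : undefined}
          />
        ))}
      </ProgressIndicator>
      <Button kind="primary" disabled={currentIndex >= steps.length - 1} onClick={onNext}>
        Next step
      </Button>
    </aside>
  );
}
```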












I also added explorations showcasing how users could interact with the Planning Assistant, a generative AI feature in the paid product, to generate a report. This helped demonstrate how AI could aid in simplifying complex processes by allowing users to describe what they wanted in natural language and having the system create a starting point they could then refine.
In collaboration with another designer, we expanded the user experience with the Planning Assistant. Together, we explored designs showing a system that could provide intelligent insights based on user data, suggest next steps, and proactively surface relevant features and functionality based on each user's unique needs. This AI-powered guidance system was geared towards making the complex Planning Analytics platform more approachable by anticipating user needs and reducing cognitive load.
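To illustrate the interaction model (the product's actual assistant interface isn't public, so the endpoint, types, and fields below are hypothetical), the sketch shows the rough shape of a prompt-to-report exchange: the user describes the report in natural language and receives a draft definition to refine.

```ts
// Hypothetical sketch of the prompt-to-report interaction. The endpoint
// ('/assistant/report-draft') and all types/fields are illustrative
// placeholders, not Planning Analytics' actual API.
interface ReportDraftRequest {
  prompt: string;       // e.g. "Show Q3 travel spend by department vs. budget"
  workspaceId: string;  // assumed identifier for the trial workspace
}

interface ReportDraft {
  title: string;
  rows: string[];       // dimensions the assistant proposes for rows
  columns: string[];    // dimensions/measures proposed for columns
  notes?: string;       // the assistant's explanation, for the user to review
}

export async function draftReportFromPrompt(req: ReportDraftRequest): Promise<ReportDraft> {
  const res = await fetch('/assistant/report-draft', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(req),
  });
  if (!res.ok) {
    throw new Error(`Assistant request failed with status ${res.status}`);
  }
  // The user then refines this draft in the report editor rather than
  // starting from a blank canvas.
  return (await res.json()) as ReportDraft;
}
```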












After defining an optimal flow, based on the task journey proposed by the product management team, we collaborated with the research team to carry out Rapid Iterative Testing and Evaluation (RITE) - a fast-paced feedback and iteration approach to help improve the experience based on user feedback. The research aimed to understand how the proposed experience performed with prospective customers while also determining the time to value and how users reached their "aha moment" with the trial.
Each research session lasted 60 minutes and focused on Financial Planning and Analysis (FP&A) professionals who had never used Planning Analytics before. Results were very encouraging, with 83% positive feedback: users stated they could clearly see the value in the product and had a great overall experience using it. However, users highlighted that the number of clicks required before being directed to a mission could cause friction for prospective users looking to dive straight into the product and make decisions.
Based on these insights from research and stakeholder feedback, we pivoted away from the three-path personalization approach. Instead, we created an experience where all new users would first go through a task before exploring the product. This addressed the potential friction highlighted in the research sessions while also reducing the barrier to entry: by showing how the product features work for each task's use case, it provided a consistent foundation before self-exploration.


Finally, I revamped the visual representation of task cards by replacing the generic imagery with actual screenshots of each task's content. This update directly addressed user feedback from our research sessions, where participants expressed confusion about the disconnect between card thumbnails and the actual task content. By creating this visual alignment between the card preview and the experience itself, users gained a clearer understanding of what to expect before clicking. When we tested these updated thumbnails in subsequent sessions, participants responded positively, noting that the new approach helped them make more informed choices about which tasks to explore.


Product launch and iteration
We successfully launched the new trial experience in September 2023, receiving positive feedback from both users and internal stakeholders. Rather than considering the project complete, I continued iterating on the design through close collaboration with development and product management.
Some key iterations included adding more guided tasks to showcase additional use cases, which expanded the product capabilities users could experience firsthand. We refined the task flow based on user analytics, making adjustments where we saw points of confusion or drop-off. The visual design elements were enhanced to improve clarity, ensuring users could easily understand their options and next steps.
A significant enhancement was the introduction of a navigation widget that displays bite-sized steps for completing each task. This feature includes a "show me" button that visually indicates exactly where users need to interact within the interface, reducing confusion and supporting learning.
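As a rough illustration of the "show me" behaviour, the sketch below scrolls the relevant control into view and pulses a temporary highlight on it; the selector and the pa-show-me-highlight class name are assumed placeholders rather than the production implementation.

```ts
// Illustrative sketch of a "show me" handler: scroll the target control
// into view and pulse a temporary highlight. The selector and CSS class
// name ('pa-show-me-highlight') are assumed placeholders.
export function showMe(targetSelector: string, durationMs = 2000): void {
  const target = document.querySelector<HTMLElement>(targetSelector);
  if (!target) return; // step target not on screen yet

  // Bring the control into the viewport, centred where possible.
  target.scrollIntoView({ behavior: 'smooth', block: 'center' });

  // Pulse a highlight so the user's eye lands on the right spot.
  target.classList.add('pa-show-me-highlight');
  window.setTimeout(() => target.classList.remove('pa-show-me-highlight'), durationMs);
}

// Example: wired to the widget's "Show me" button for the current step.
// showMe('#forecast-grid');
```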









During the iteration process, we discovered a technical constraint: when new content was added, all existing content would reset, and users would be unable to interact with the product while the update ran. Rather than just displaying a loading spinner, I designed a new loading experience that highlights new features and surfaces helpful resources while content loads, creating a sense of anticipation and discovery for users returning to the platform. This approach reframes the reset as a chance to discover new capabilities rather than a frustrating interruption, turning what could be perceived as wasted time into a valuable learning opportunity while working within the technical constraints of the system.
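A simplified sketch of that loading experience is below, assuming React and Carbon's Loading component: while content resets, it cycles through a short list of "what's new" highlights instead of showing a bare spinner. The highlight copy and the rotation interval are illustrative.

```tsx
// Sketch of a loading screen that rotates "what's new" highlights while
// trial content is being reset. Highlight copy and the 4-second interval
// are illustrative assumptions.
import React, { useEffect, useState } from 'react';
import { Loading } from '@carbon/react';

const highlights = [
  'New guided task: build a rolling forecast',
  'Planning Assistant can now draft reports from a prompt',
  'Updated sample data for FP&A scenarios',
];

export function ContentUpdateScreen({ active }: { active: boolean }) {
  const [index, setIndex] = useState(0);

  // Rotate through the highlights every few seconds while the update runs.
  useEffect(() => {
    if (!active) return;
    const timer = window.setInterval(() => {
      setIndex((i) => (i + 1) % highlights.length);
    }, 4000);
    return () => window.clearInterval(timer);
  }, [active]);

  if (!active) return null;

  return (
    <div role="status" aria-live="polite">
      <Loading withOverlay={false} description="Updating trial content" />
      <p>While you wait, here's what's new:</p>
      <p>{highlights[index]}</p>
    </div>
  );
}
```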






Self-guided clickthrough demo
Building on the success of the sandbox trial, I collaborated with the marketing team to create a lightweight, self-guided demo that wouldn't require account creation. This experience was designed to provide a simplified flow demonstrating key Planning Analytics use cases, making it accessible to prospects earlier in their evaluation journey. We used strategically placed tooltips to guide user attention through the interface, mimicking the more robust sandbox experience but in a more streamlined format. The demo concludes with clear calls-to-action, inviting users to either book a personalized demo or create a sandbox account for deeper exploration.
The partnership with marketing led to further refinements of the main sandbox experience, including a new onboarding flow with three paths. New users can choose between a self-guided experience within the sandbox trial or direct access to guided tasks based on their comfort level and learning preferences. Returning users receive a third option to explore independently with optional video support, acknowledging their existing familiarity with the platform while still offering assistance if needed. This multi-path approach ensures users at different stages of their journey receive an appropriate experience, maximizing engagement and conversion potential.















Results
The metrics from our redesign revealed a dramatic improvement in user engagement and business outcomes. Compared with the previous trial, the redesigned experience delivered the following results:
- 7 prospective client demos in the first month
- 21.7% completion rate for at least one task
- 67% growth in trial sign-up completion rate
Like what you see? Drop me a line!
Thanks for taking the time to check out my portfolio. I'm more than happy to answer any questions or discuss new opportunities, so feel free to reach out any time.
Get in touch