Photo first℠
Realtor.com | 2019 | Mobile App Design, Artificial Intelligence
Overview
As a lead designer for the Lead Growth team, I delivered design work and strategy focusing on conversion, engagement and monetization growth.
Redesigning realtor.com’s photo experience was one of the team’s first projects. What started as a small feature update turned into a big opportunity, followed by multiple projects that benefited the business’s ecosystem. The features were later branded as Photo First by the company, as media reported on us being a market differentiator: a photo-centric experience that uses artificial intelligence to provide relevance for home-shoppers in their searches.
Photo First was one of the company’s best performing features in the fiscal year. Here are a few key metrics:
60%+ unique users engaged with the new features
Lifted 20%+ lead generation
Lifted 20%+ unique users for existing photo features
Lifted 10%+ conversions from Search Result Page to Listing Detail Page
The project has gone through a few iterative phases:
Vertical photo gallery
Photo categorization by AI to enable personalized experience
More AI-powered features to facilitate users’ searches, opening up more project opportunities for future development
Background
Realtor.com connects home-shoppers and agents
Because part of the business runs on lead generation, realtor.com delivers different experiences to home-shoppers and agents. My team, on the consumer side, helps home-shoppers search and discover properties of interest, then seamlessly connect with agents who guide them through the home-buying process.
A new team formed to focus on lead growth & monetization
The company’s new structure has a few orgs, one of which focuses on the growth of the core business, where I took the lead for design and served as the advocate for users.
Close collaboration across functions led to the success of this team, spanning design, product, engineering, research, data analysis, and data science. We constantly validated through A/B tests and significantly lifted conversion and user engagement through a wide range of projects, from small to large scale. Eventually we had the highest quarterly velocity among feature development teams, and our improved agile process set an example for other teams.
My team focused on the mobile platform, and there was an equivalent team for desktop. Following “mobile first” principles, we were usually a step ahead on experiments, while the designers across teams constantly communicated and collaborated.
From objective to SMALL feature
OKR: x% lead growth per quarter
Initially, it was the only objective for our team, handed down from management.
On one hand, it was clear what we were pursuing; on the other hand, it was vague because it was not an immediately actionable item. PM and I partnered closely to figure out more concrete plans, one of which was defining the measurement of success.
Photo is the top used feature, but not the top lead driver.
We sat down with data analysts, looking through extensive metrics on all the functions in the app, and found that Photo is the most used feature, possibly because it is one of the initial access points where home-shoppers decide whether they like a property listing before contacting agents. However, contrary to expectations, it didn’t generate the most leads.
Why?
Our researchers did weekly benchmarking, analyzing apps in the vertical market such as Zillow, Trulia, and Redfin. I also looked at how apps in other markets handle photos, such as Yelp, Google Photos, and Instagram.
Realtor.com is strong in listing data. We should have better (faster and more accurate) information than users could find anywhere else, unless we don’t show it or users can’t find it.
The existing photo gallery is no longer efficient for today’s listings
The following video shows our old photo experience: home-shoppers come to the Listing Detail Page (LDP), tap once to go directly into the immersive full-screen photo gallery, swipe through the photos one by one, and can zoom in and out. They used to appreciate its simplicity and speed.
However, in recent years, listings have accumulated massive numbers of photos. Among the top 25% of listings, the average number of photos per listing is 47, and I have seen a single listing with 680 photos. It’s nearly impossible for a home-shopper to find a particular kitchen photo among hundreds. Most users will swipe a few times, then leave.
With two goals in mind, we held a sketching session for solutions
Help home-shoppers find the right photo more efficiently
Continue delighting home-shoppers with immersive photo experience
There were concepts as simple as adding a layer of vertical gallery, and also as complex as totally redesigning the Listing Detail Page. PM and I looked at the estimated efforts and impact to make decisions on what concepts to pursue.
Some concepts did not make the final cut based on user testing results
I created interactive prototypes for a few concepts, and our researcher took them into user testing.
One example is shown below: the redesigned default view of the Listing Detail Page would land users in the photo gallery with a simple scroll. Users would naturally immerse themselves in the listing’s photographic story right away.
But we didn’t pursue this concept, because users felt more confused and found it difficult to see key information right below the fold. Other concepts performed better at solving the identified usability issue.
Phase 1 solution: vertical photo gallery
I added a vertical photo gallery between the Listing Detail Page (LDP) and the full-screen mode, a pattern that has already been proven efficient by many systems and apps. Users can quickly skim through by vertical scrolling. Photos are kept in their original ratios so content is not cropped, while users have the option of a grid view to see more photos on one screen. Full-screen mode remains accessible and intuitive when needed, only one tap away.
From a business objective, we narrowed down to a usability issue that we had an actionable solution for.
Remarkable A/B test results, exceeding our expectations, proved our hypothesis and direction.
From small feature to bigger picture
“But is this good enough for us to meet the ambitious goal?” Even though we had achieved great numbers, I couldn’t help but think. I attempted to look at the bigger picture again.
“What is the real problem?”
We had personas and extensive user journeys, which were very helpful for answering the question. I took out part of the user flow and deconstructed it, which helped us understand that it could go wrong at almost every step.
For example: the hero image does not interest the user; the gallery is hard to use; repetitive photos push the user away; it is difficult to locate photos of the kitchen and bedroom; or, when something is missing either in the photos or the property facts, going back and forth takes a lot of effort and users forget where they were.
Putting together the flows helped us see why the experience is broken
When realtor.com got the data, we didn’t do enough optimization; most of the time we showed it as-is. That’s why the heavy lifting was on the user’s side. The usability issue we identified earlier was only the surface of a systemic problem. Those gaps are the root of the user’s needs; if we can fill them, we can dramatically enhance users’ engagement with us.
Updated problem: the system fails to deliver information relevant to home-shoppers’ interests in a timely manner, which stops them from moving forward.
In order to find a better solution, I asked some key questions. These HOW MIGHT WE questions closely target the root problem while also taking our long-term ecosystem into account.
Ideas supported by machine learning
I found that artificial intelligence could be a great answer to the HOW MIGHT WE questions, so I sat down with data scientists to evaluate some ideas I had. They confirmed, after doing a spike, that “with our in-house deep learning model, AI could recognize photos with accuracy over 97%.”
What does that mean? It means we can do various things with a technique called Advanced Image Tagging. For example, we can:
Break down photos from raw data into categories
Connect photos with property facts, such as style or materials etc.
Provide personalized content and better listing recommendations
Enable searching through photos
The tags could be unlimited, but we carefully optimized and prioritized them based on domain knowledge, because some matter more to users than others. For example, to a home-shopper, the presence of a dining table in a photo is not as significant as the presence of a pool.
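To make the prioritization idea concrete, here is a minimal sketch of how AI-generated tags might be ordered by domain-knowledge weights. The tag names, weights, and function are hypothetical illustrations, not the actual production model:

```python
# Illustrative sketch: ordering photo tags by domain priority.
# Tags and weights below are hypothetical, not realtor.com's real taxonomy.

# Domain-knowledge weights: a pool matters more to a home-shopper
# than a dining table appearing in a photo.
TAG_PRIORITY = {
    "pool": 1.0,
    "kitchen": 0.9,
    "backyard": 0.8,
    "bedroom": 0.7,
    "bathroom": 0.6,
    "dining_table": 0.1,
}

def prioritize_tags(raw_tags):
    """Keep known tags and sort them by domain importance, highest first."""
    known = [t for t in raw_tags if t in TAG_PRIORITY]
    return sorted(known, key=lambda t: TAG_PRIORITY[t], reverse=True)

# A photo the model tagged with several labels:
print(prioritize_tags(["dining_table", "kitchen", "pool"]))
# ['pool', 'kitchen', 'dining_table']
```

In practice the weights would come from research and engagement data rather than a hand-written table, but the ordering principle is the same.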
Phase 2 solution: photo category with AI
Photo categories appear right away on the Listing Detail Page, as well as in the photo gallery. As users continue using the app, the categories can be personalized: if the kitchen is important to a user, they will see kitchen photos first.
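As a hedged sketch of the personalization logic described above, category order could be derived from each user’s viewing history, falling back to the default order for categories they have not engaged with. All names and data here are illustrative assumptions:

```python
# Hypothetical sketch: personalizing photo-category order from engagement.
from collections import Counter

def personalized_order(categories, view_history):
    """Order categories by how often the user viewed photos in each,
    breaking ties with the default category order."""
    counts = Counter(view_history)
    default_rank = {c: i for i, c in enumerate(categories)}
    return sorted(categories, key=lambda c: (-counts[c], default_rank[c]))

default = ["exterior", "living_room", "kitchen", "bedroom"]
history = ["kitchen", "kitchen", "bedroom"]  # user keeps opening kitchen photos
print(personalized_order(default, history))
# ['kitchen', 'bedroom', 'exterior', 'living_room']
```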
Phase 3 solution: image/text mutual understanding
Home-shoppers commonly forget a listing’s key facts (e.g. room size, yard size, floor material) while enjoying the immersive moment with photos. With the same AI technique, we can help users connect the dots.
This design solution helps users quickly learn the textual facts of the property from the photos they are looking at. For example, a user can see right away that the kitchen is 200 sqft, has an electric oven, and features a marble countertop.
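A minimal sketch of the mapping behind this: the AI-detected category of the photo being viewed selects which property facts to surface. The category names and facts are illustrative, echoing the kitchen example above:

```python
# Hypothetical sketch: surfacing textual facts alongside the photo in view.
# Fact data and category names are illustrative examples only.

PROPERTY_FACTS = {
    "kitchen": ["200 sqft", "electric oven", "marble countertop"],
    "backyard": ["0.25 acre lot", "in-ground pool"],
}

def facts_for_photo(photo_category):
    """Return the facts to overlay for a photo, based on its detected category."""
    return PROPERTY_FACTS.get(photo_category, [])

print(facts_for_photo("kitchen"))
# ['200 sqft', 'electric oven', 'marble countertop']
```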
Meanwhile, I took the chance to also launch the redesign of Property Details. The long detail page had not been touched for years and was difficult to search for information. With Photo First, users can now seamlessly navigate between photos and property information.
Interactive Prototype
I created this interactive prototype for various purposes, from quick testing to development specs.
Visions after phase 3
I created multiple concepts for future product visions; here’s one of them. As home-shoppers continue engaging with the app, the AI will show a match score and highlight what they care about most for each home, to help them make better decisions.
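One way such a match score could be computed, as a hedged illustration: the weighted fraction of a user’s preferred features present in a listing. The weighting scheme, feature names, and numbers are my assumptions, not the actual model:

```python
# Hypothetical sketch: a simple match score from learned user preferences.

def match_score(preferences, listing_features):
    """Weighted percentage of the user's preferred features present in a listing.
    `preferences` maps feature -> importance weight learned from engagement."""
    total = sum(preferences.values())
    if total == 0:
        return 0
    hit = sum(w for feat, w in preferences.items() if feat in listing_features)
    return round(100 * hit / total)

prefs = {"pool": 3.0, "large_kitchen": 2.0, "garage": 1.0}
listing = {"pool", "garage", "fireplace"}
print(match_score(prefs, listing))  # 67
```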
Practical consideration
We have immediate action items, and we have visions. Since the AI-integrated photo features are differentiators in the market, we want them to be as good as possible when released. But that means more development time. Remember, we have a quarterly OKR and limited time and resources. With agile development, we don’t have a strict six-month plan, so we needed some practical considerations to avoid over-engineering.
Is delivering part of the work still providing value to our user?
In the case of this project, yes.
But I want to bring up a practical process from our design team: GOOD BETTER BEST. These do NOT necessarily refer to the quality of the final design, but to what can be accomplished in a practical environment. To be more specific, Good is the low-hanging fruit that can be done within 1-2 sprints. Better is the sweet spot, usually with a really good cost/performance ratio. Best is the new vision.
Applied to this particular project, the vertical gallery is the Good; the categories and image/text mutual understanding are the Better; and the match score is the Best. As time goes by, the Good gets done, the Better becomes the Good, and the Best becomes the Better.
Key result
We contributed 76% of company’s quarterly lead growth, along with lifting 30% for most user engagement metrics.
The Good design lifted leads by 21%, making photos both the most used feature AND the best lead generator in the app. The Better design had 60% of users engaging with it and lifted leads by 23%.
These outstanding numbers made the feature the company’s best performing one; it was thus branded as Photo First and earned more resources for future development.
Framework & Methods
Here is a summary of the framework that guided my design projects, incorporating the Good Better Best practice. Each stage includes methods that, depending on the situation, can help reach better design decisions.