Hatch: Startup Simulation
The Design/Business Challenge
Simulate a startup within existing markets, applying interaction design skills to create not only a product but also a business.
It is a 12-week group project in the Creative Founder course taught by Kate Rutter.
Team: Nathalia Kasman (CEO), Claire Zhou (CMO), Leo Zhang (CFO)
Methodology: Steve Blank’s Customer Development process, Lean Start-up, Quantitative research, Market research, Prototyping, Usability testing
The Project Overview
When students learn new things, solving problems alone isn’t easy. However, it’s also hard to ask for help because:
The person who can help you might be outside of your social circle
You might hesitate to ask for help so as not to appear burdensome
Hatch, a company that provides a peer-to-peer skill sharing app, helps students to:
Expand their connections
Create mutual learning experiences where peers can help one another
Students will create a profile that provides information about what they want to learn and share, as well as specific learning goals. We will provide prompts to guide them throughout the process.
Based on their profile, we suggest co-learning peers who have complementary interests and goals and who attend the same school. They tap “Connect” if they think the suggested peer is a good fit.
Once both sides think they are a good match for each other, they can start a chat room. In the chat room, we provide conversation guides to help them break the ice.
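The matching mechanic described above can be sketched in code. This is a minimal illustration, not our actual implementation; the names, fields, and helper functions are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    """A student's profile: what they want to learn and what they can share.
    (Hypothetical structure for illustration only.)"""
    name: str
    school: str
    learn: set[str]
    share: set[str]

def suggest_peers(me: Profile, others: list[Profile]) -> list[Profile]:
    """Suggest peers from the same school whose interests are complementary:
    they can share something I want to learn, and I can share something
    they want to learn."""
    return [
        p for p in others
        if p.school == me.school
        and (me.learn & p.share)   # they can teach me something
        and (p.learn & me.share)   # I can teach them something
    ]

def can_open_chat(a_connected: bool, b_connected: bool) -> bool:
    """A chat room only opens once BOTH sides have tapped 'Connect'."""
    return a_connected and b_connected
```

The key design choice the sketch captures is that matches are complementary rather than similar: each peer must be able to offer a skill the other wants, and the chat is gated on mutual consent.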
The Endless Pivots Journey
Different from regular design courses, the Creative Founder course is a start-up class, which means that everything needs to be tested by the market. Hence, we tried to validate the problem by talking to many of our target users, and went back to rethink our ideas whenever a problem wasn’t validated.
In the first part of the class, we focused on developing products for illustrators, but the problems we targeted either weren’t real (not validated by users) or were caused by a big systemic issue that we weren’t familiar with.
It was painful, but after endless pivots, we decided to start everything all over again.
Research of students’ learning behaviors
Since we needed to catch up with the class, we decided to focus on a problem we had all experienced ourselves as students: we had few effective resources when learning new things. We then ran several activities to understand students’ current learning behaviors.
First, we put up a huge poster on campus and encouraged people to respond by writing down their most efficient way to learn a new skill. From the responses, we learned that asking someone who knows the skill is considered the most efficient way to learn.
Then, we interviewed 8 students to further understand their thoughts and behaviors around helping peers and asking peers for help.
The key insights we learned are:
Helping others is also a form of learning.
Several students said that helping their friends gives them a recap of what they learned and helps them understand the material better.
There is hesitation in asking for help.
People don’t ask for help because they don’t want to appear burdensome, and will only ask when friends who can help them are physically nearby.
Since our insights showed that students usually learn from peers to solve problems more quickly, we thought we could create a digital site/app for students to share their skills with each other.
Once we decided to make a skill-sharing app, we researched many other learning platforms in the marketplace, ranging from tutoring to learning-exchange apps. We wanted to foster face-to-face interactions and mutual learning experiences, because based on our interviews, people are more open to seeking help (and to providing it) when they are physically close to each other.
To validate our concept, we wanted to know how many students would be interested in joining our peer-sharing community, because more sign-ups would indicate that there is a need we can fulfill.
We recruited students through:
On-site recruitment: we set up a table and encouraged students to write down what they wanted to learn and share on our sign-up forms.
Sign-up boxes: we placed sign-up boxes around campus so students could fill out sign-up forms and leave them in the boxes as they passed by.
Offline marketing: we posted posters, flyers, and stickers around campus. Students could scan the QR code and fill out our online sign-up form.
Online marketing: we posted our posters and latest activities on Instagram and Facebook, as well as established our landing page.
We got 117 QR code scans and 81 sign-ups in only 3 weeks!
Once we had successfully recruited participants, we prototyped the learning exchange experience using the Wizard of Oz method: pairing students up manually by their complementary interests, pretending to be our app’s matching algorithm. In two weeks, we ran 12 experiences (6 pairs of students) in total.
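The pairing we did by hand can be sketched as a simple greedy algorithm over the sign-up forms. This is an illustrative sketch of the logic we imitated, not a real system; the tuple layout is an assumption for the example.

```python
def pair_students(signups):
    """Greedily pair students with complementary interests.

    Each entry in `signups` is a (name, learn, share) tuple, where
    `learn` and `share` are sets of skills (hypothetical format).
    A pair is complementary when each student can share a skill
    the other wants to learn. Students with no complement stay unmatched.
    """
    pairs = []
    unmatched = list(signups)
    while unmatched:
        a = unmatched.pop(0)
        # Find the first remaining student whose skills complement a's.
        match = next(
            (b for b in unmatched
             if (a[1] & b[2]) and (b[1] & a[2])),
            None,
        )
        if match is not None:
            unmatched.remove(match)
            pairs.append((a[0], match[0]))
    return pairs
```

A greedy first-fit pass like this is roughly what a human does with a stack of paper forms; a production matcher would likely score candidates instead of taking the first complement.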
In week 1, before we ran the experience, we listed the assumptions that we wanted to test.
Then we contacted students via email and set up 30-minute meeting sessions based on their schedules. When both students arrived, we would introduce them and leave room for them to chat. Afterwards, we interviewed each of them separately to learn about their experience.
We noticed that each pair had a different type of learning exchange:
Students who already have specific questions in mind and just want answers.
Students who used the session to share their learning experiences and best practices.
Students who shared about their own individual practices and were inspired by one another’s processes and methods.
No matter which type of learning exchange each pair had, all participants found it inspiring to learn about other people’s ways of working, which validated our assumptions. Moreover, we uncovered more insights that we hadn’t thought of.
Based on the insights we got from week 1, we formed more assumptions and designed multiple versions of experiences to test them in week 2.
Experience 1.0: Specific info
The first insight from week 1 was that students wanted to know more specific information about their peers before meetings. Hence, we sent participants surveys asking for more specific information (learning motivation, goals, background, etc.), then forwarded the answers to their assigned peer so they could get to know each other prior to the meeting.
It turned out our assumption was true: the participants thought knowing more about their peers beforehand made meetings more efficient.
Experience 2.0: Self Browse
The second insight was that it was hard to find peers with similar learning goals. Hence, in addition to the survey exchange (same as experience 1.0), we sent participants a spreadsheet and asked them to pick 3 ideal peers from the 10 possible peers we listed.
It turned out our assumption was only partially true: the participants found self-browse “interesting, but is not necessary”, and felt that “picking suitable peers by myself might cost too much time and mental efforts”.
Experience 3.0: Customization + conversation guide
The last insight was that we needed to help students feel more comfortable during initial interactions. Hence, rather than scheduling locations and times for them, we put them into a group chat to arrange their meeting on their own, and we pretended to be a bot that provided conversation guides as icebreakers.
It turned out our assumption was true: the participants thought it helped them find more common topics with their peers.
To develop our digital MVP, I translated our successful experiences into 3 main features: profile, suggested peers, and chatroom.
Moreover, I added a feedback section for user retention, so the data could help us suggest more suitable peers to users next time.
At first, the feedback system I used was star ratings and endorsements. But from user testing, I learned that this wasn’t the best fit in this scenario, both logically and morally: “While someone voluntary helping me, I didn’t want to rate or endorse them.”
As a result, rather than turning the experience into any kind of objective scoring system, I kept it simpler and more personal: users could write down their feedback and choose whether they want to send it to their peers.
I mapped out the user flow, which includes the 3 key features we developed from the experiences (profile, suggested peers, and chatroom), as well as the feedback section to retain users.
The Pitch Day
At the end of the course, we were asked to pitch our idea as if we were pitching for funding. We asked for $520,000 to launch our product and reach our next milestones, pitching to a group of 11 venture capitalists. In the end, 9 of the 11 venture capitalists funded us, bringing the total investment to $1,681,500!
What is the difference between digital and analog products, and how can they partner with schools?
During our experiments, we discovered that on-site recruiting and activities were far more effective than their online counterparts. Based on this insight, we first thought of making Hatch a school service or program. However, that scenario limits the number of people we can reach. Also, even though our initial idea was to make Hatch a free product partnered with schools, we ended up adopting a premium model where users bear the cost, since it is difficult to persuade schools to adopt a brand-new product. If we had more time, we would rethink our financial models and business strategies.
What is the next step for Hatch?
Our goal is to make Hatch really happen in school. Due to time and financial constraints, we aren’t able to build the app, but we are discussing it with our school’s learning resource centers, libraries, and student groups. We continue to do lots of physical testing and experiments in schools, including weekly meetups, drop-in tutoring, etc. Hopefully, Hatch can successfully launch at our school and become an ongoing program that exists even after we graduate.
Listen to the market
Working on Hatch was a very unique experience, since it was the first project where I designed a product in a real business context, which meant every design decision had to be tested by the real world. At first, it was hard to let go of the designer’s ego, pull back, and leave everything up to the market. We were trained to be interaction designers focused on creating complete user experiences. However, in this scenario, the product was just a tool for validating and iterating upon our concept.
While following the lean UX cycle of build, measure, and learn, I had the opportunity for a series of real human interactions. Every week, we interviewed people and gathered insights. Based on those insights, we came up with solutions and turned them into experiments. Then, we ran those experiments the following week to iterate upon our concept. In the end, we talked to more than 60 people and conducted 12 experiments and 8 usability tests. Interacting with people at such an intense frequency was both physically and mentally tough, but it helped us develop a real product that can solve real problems. The most fulfilling part was that I actually used my abilities to help students enhance their learning experiences and witnessed the change myself, by organizing skill-exchange meetings and hearing the feedback from participants.
This experience taught me to listen to the market (instead of just to the product). It not only helped me become a better interaction designer, but also gave me a broader life perspective: knowing how big the market is, and how big the world is outside of certain design fields.