AI Nutrition Tracker

Client

Personal Project

Year

2024

Technical Stack

React Native, Expo, TypeScript, RevenueCat, Firebase, AI SDK

Industry

Health

Scope of Work

Design, Development, Continued support

I built an app that lets AI tell you what you're eating from just a photo. When I looked at existing nutrition trackers, I saw they all made you input ingredients one by one. I knew there had to be a better way. Now you just snap a picture - perfect for those times when you're enjoying a complex dish like Korean food and don't know half the ingredients.
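To give a flavor of the core flow, here is a rough sketch of what a photo-to-nutrition call can look like with the AI SDK. The schema fields, model choice, and function names are illustrative placeholders rather than the exact production setup.

```typescript
// Sketch: send a meal photo to a vision model and get structured nutrition
// data back. Schema fields and model choice are illustrative assumptions.
import { generateObject } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

// Example shape of a "scan" result.
const mealSchema = z.object({
  dishName: z.string(),
  calories: z.number(),
  protein: z.number(),
  carbs: z.number(),
  fat: z.number(),
  ingredients: z.array(z.string()),
});

export async function analyzeMealPhoto(photo: Uint8Array) {
  const { object } = await generateObject({
    model: openai('gpt-4o'),
    schema: mealSchema,
    messages: [
      {
        role: 'user',
        content: [
          { type: 'text', text: 'Identify this meal and estimate its nutrition facts.' },
          { type: 'image', image: photo },
        ],
      },
    ],
  });
  return object; // e.g. { dishName: 'Bibimbap', calories: 560, ... }
}
```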

First step: Validation

I spent one sunny Sunday building the first prototype. With that foundation in place, I could rapidly iterate on the UI until it felt natural and intuitive to use.

The big question was: would people actually use this? I believe in moving fast to validate ideas. So I launched it as a web app first, set up a small Google search campaign, and added Stripe subscriptions to test if people would pay for it.
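For the web paywall, a subscription-mode Stripe Checkout session is enough to test willingness to pay. The sketch below assumes a placeholder price ID and redirect URLs, not the app's real billing setup.

```typescript
// Sketch of the validation paywall: a Stripe Checkout session in
// subscription mode. Price ID and URLs are placeholders.
import Stripe from 'stripe';

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!);

export async function createSubscriptionCheckout(customerEmail: string) {
  const session = await stripe.checkout.sessions.create({
    mode: 'subscription',
    customer_email: customerEmail,
    line_items: [{ price: 'price_monthly_placeholder', quantity: 1 }],
    success_url: 'https://example.com/billing/success',
    cancel_url: 'https://example.com/billing/cancel',
  });
  // Redirect the user to session.url to complete payment.
  return session.url;
}
```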

Within days, we had our first paying subscribers. That was the green light I needed to move forward with the full vision.

A successful scan of a meal

Second step: Full launch

The App Store was calling. To get there, I tackled several key challenges:

  • Implementing a seamless App Store payment system
  • Building a unified payment system that works across web (Stripe) and iOS (Apple Pay), so subscribers could use either platform (see the sketch after this list)
  • Polishing the UI to make it intuitive for everyone
  • Significantly enhancing the AI's accuracy and speed
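RevenueCat is the piece of the stack that makes the unified setup practical: it can attribute both App Store and Stripe purchases to the same app user, so the client only ever checks a single entitlement. The sketch below uses react-native-purchases with a placeholder API key and entitlement name, not the app's real configuration.

```typescript
// Sketch of a cross-platform entitlement check with react-native-purchases.
// The API key and entitlement identifier are placeholders.
import Purchases from 'react-native-purchases';

// Normally called once at app startup.
export function initPurchases(appUserId: string) {
  Purchases.configure({ apiKey: 'appl_placeholder_key', appUserID: appUserId });
}

// One entitlement lookup covers both web (Stripe) and iOS (App Store)
// subscribers, because RevenueCat merges purchases under the same user ID.
export async function hasProAccess(): Promise<boolean> {
  const customerInfo = await Purchases.getCustomerInfo();
  // 'pro' is an assumed entitlement identifier from the RevenueCat dashboard.
  return customerInfo.entitlements.active['pro'] !== undefined;
}
```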

I knocked these out in a few intense weeks of development, and we were live in the App Store.


Third step: Continued growth and support

Users started asking for text input alongside photos, and thanks to the flexible architecture I'd built, I could add this feature in just one day. That's the beauty of building things right from the start - you can move fast when opportunities arise.
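To illustrate the kind of flexibility that makes this a one-day feature (the exact internals differ), the analysis pipeline can accept a small discriminated union of input types, so text entry only needs a new prompt builder rather than a new pipeline. The type and function names below are illustrative.

```typescript
// Sketch: one pipeline, multiple input types. Adding text input only means
// adding a branch that builds a different prompt. Names are illustrative.
type MealInput =
  | { kind: 'photo'; image: Uint8Array }
  | { kind: 'text'; description: string };

function toPromptContent(input: MealInput) {
  switch (input.kind) {
    case 'photo':
      return [
        { type: 'text' as const, text: 'Identify this meal and estimate its nutrition facts.' },
        { type: 'image' as const, image: input.image },
      ];
    case 'text':
      return [
        { type: 'text' as const, text: `Estimate nutrition facts for: ${input.description}` },
      ];
  }
}
```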

Ready to get started?

Let's discuss your idea in a free discovery call.