Norvana. Personal health intelligence
UK 2024-2025 · AI STRATEGY, FUNCTIONAL PROTOTYPING, MULTIMODAL UX, CONTEXT ENGINEERING, AND GENERATIVE UI
A working AI health companion — from raw vision to functional iOS prototype in twelve weeks. Voice-first onboarding, personalized health reports, nutrition analysis with follow-up guidance, AI-generated workout illustrations. Built entirely by designers in code. The prototype became Norvana's fundraising demo — investors and partners experienced the product instead of reading a pitch deck.
70,000+
Users accessible through pre-launch UK partnership
Government trials
Malta exploring nationwide clinical rollout
HEALTH, TODAY, FOCUS
Three tabs. That's the entire information architecture.
Most health apps add a tab for every feature. We reduced the entire experience to three views because that maps to how people actually think about their health: where do I stand, what should I do today, what am I working towards.
The "Ask Norvana" bar persists on every screen — not a separate chat tab, but a layer that knows which view you're on and what data is relevant. The whole app is a conversation, but not a chatbot.
AI generates every card, every insight, every recommendation. The chat bar is just where you talk back.
AI INTERACTION LAYER
The bar at the bottom of every screen is the product. Not a feature. Every interaction — text, voice, camera — runs through a single multimodal input. A grid icon opens quick actions for anything that's faster as a tap than as a typed command.
AI decides the response format. Sometimes text. Sometimes selectable options. Scan chicken and it asks back: chicken or turkey? The bar doesn't just accept input. It shapes it.
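One way to sketch that format decision — all names and thresholds here are hypothetical, not Norvana's actual implementation — is a small router that inspects what the model is confident about and only falls back to plain text when there is nothing to disambiguate:

```python
from dataclasses import dataclass, field

@dataclass
class Response:
    kind: str                     # "text" or "options"
    text: str
    options: list = field(default_factory=list)

def route(intent: dict) -> Response:
    """Pick a response format from the model's output.

    Illustrative rule: a low-confidence classification that comes
    with candidates (e.g. a scan that can't tell chicken from
    turkey) is rendered as selectable options, not prose.
    """
    if intent.get("confidence", 1.0) < 0.8 and intent.get("candidates"):
        return Response(kind="options",
                        text="Which one is it?",
                        options=intent["candidates"])
    return Response(kind="text", text=intent["answer"])

scan = {"answer": "chicken", "confidence": 0.55,
        "candidates": ["chicken", "turkey"]}
print(route(scan).kind)           # ambiguous scan -> selectable options
```

The point of the sketch: the format is an output of the model call, not a fixed property of the screen.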
Health apps love scores. 87 out of 100. Four stars. Users chase the number instead of understanding their health. We dropped scores entirely. Word-based status labels instead — "Optimized," "Needs attention."
The health report is another form of AI speaking to you — generated from your biomarkers, Apple HealthKit data, and conversation history. Not a static dashboard. A personalized analysis that highlights what matters, explains possible causes, and lets you act. "Add to Focus" turns any insight into a goal — one tap updates your context model and recalibrates every recommendation the AI gives from that point on.
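The mechanics behind "Add to Focus" can be sketched in a few lines — the context shape and prompt wording below are invented for illustration, not taken from the product. The key idea is that every generation call reads from one shared context object, so appending a goal changes everything downstream:

```python
# Hypothetical shared context: biomarkers, HealthKit data, and the
# user's active focus goals all live in one place.
context = {
    "biomarkers": {"vitamin_d": "low"},
    "focus": [],
}

def add_to_focus(ctx, insight):
    """One tap on 'Add to Focus' appends the insight as a goal."""
    ctx["focus"].append(insight)

def build_prompt(ctx):
    """Every later recommendation is generated from this context."""
    goals = ", ".join(ctx["focus"]) or "none yet"
    return f"Active goals: {goals}. Tailor every recommendation to them."

add_to_focus(context, "raise vitamin D")
print(build_prompt(context))
```

No recalibration step is needed: the next call simply sees the updated context.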
PERSONALIZED NUTRITION ANALYSIS
We designed Norvana’s nutrition feature as an interactive consultation, not a calorie counter.
Users can analyze a meal and instantly see its breakdown: calories, protein, fats, carbohydrates, and other nutrients. But the experience doesn’t stop at numbers.
The interface encourages curiosity. Norvana explains what each nutrient means for your body and invites you to ask follow-up questions about its impact — turning raw nutritional data into informed, personalized guidance.
It’s not just about tracking what you eat. It’s about understanding how it affects you.
We designed Food Scan as a fast, low-friction way to understand what’s actually on your plate.
Instead of manually logging ingredients, users simply scan their meal. The system analyzes the composition and presents a clear nutritional breakdown in seconds.
But it goes beyond numbers. Food Scan also highlights imbalances and suggests small adjustments: adding more protein, reducing sodium, or improving portion ratios.
The goal wasn’t just recognition — it was optimization. A simple scan becomes a moment of insight, helping users make smarter decisions without turning every meal into a spreadsheet.
PERSONALIZED WORKOUT PLANNING
Workouts are generated weekly from your full context — fitness level, goals, body data, preferences. Preferences shift. Your energy changes. Something hurts.
Open a session, say "my knee hurts," and the AI asks follow-up questions and rewrites that specific training. Same context layer that powers nutrition and health reports — one sentence in the chat adjusts the whole plan.
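A minimal sketch of that rewrite, with made-up exercise data and joint tags (the real system works from the full context model, not a lookup table): once the AI has confirmed what hurts, any exercise loading that joint is swapped for a substitute.

```python
# Hypothetical session and substitution table for illustration only.
SESSION = [
    {"name": "Back squat",  "loads": ["knee"]},
    {"name": "Bench press", "loads": ["shoulder"]},
    {"name": "Leg press",   "loads": ["knee"]},
]

SUBSTITUTES = {"knee": {"name": "Glute bridge", "loads": ["hip"]}}

def rewrite(session, constraint):
    """Swap out every exercise that loads the painful joint."""
    return [SUBSTITUTES[constraint] if constraint in ex["loads"] else ex
            for ex in session]

adjusted = rewrite(SESSION, "knee")
print([ex["name"] for ex in adjusted])
```

One reported constraint, one pass over the session — the rest of the week's plan stays untouched.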
AI-GENERATED VISUALS
Every image in the app is AI-generated. Food images worked well out of the box. Exercise images didn't.
Every workout image is generated from your profile — age, weight, fitness level, health conditions, goals. Figures are headless, cut at the neck, so you see yourself. Body type rendered slightly above your current level. A future version of you.
Raw workout plans fed directly to an image model produced heads rotated 180°, barbells growing from necks.

The fix wasn't better prompts. We added an "art director" agent — one model generates the plan, a second describes the exercise in anatomical detail: body position, limb angles, equipment dimensions.
That feeds the image model. Error rate dropped from 4-of-7 broken to 1-of-7. These were generated using first-generation Gemini image models.
By the time you're reading this, they're likely already outdated and this approach may no longer be needed :)
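The two-model chain described above can be sketched as follows — the stand-in functions and prompt wording are illustrative, not Norvana's actual prompts. Stage one plans; stage two expands each exercise into an explicit spatial description; only that expanded text reaches the image model.

```python
def plan_workout(profile):
    """Stand-in for the planning model's output."""
    return ["goblet squat", "push-up"]

def art_direct(exercise):
    """Stand-in for the 'art director' model: turns a terse
    exercise name into an explicit anatomical description —
    body position, limb angles, equipment placement."""
    return (f"{exercise}: neutral spine, feet shoulder-width, "
            f"knees tracking over toes, equipment held at chest height")

def image_prompts(profile):
    """The image model never sees the raw plan, only the
    art-directed descriptions."""
    return [art_direct(ex) for ex in plan_workout(profile)]

for prompt in image_prompts({"level": "beginner"}):
    print(prompt)
```

The fix, in other words, is structural: insert a description step between planner and renderer rather than tuning a single prompt.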
DAILY HEALTH GUIDANCE
No fixed targets. Every metric — sleep, activity, nutrition — is shown as a personalized range. Your optimal window, based on your data and grounded science. Not a single number to hit. A zone to stay in.
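As a rough sketch of the idea — the baseline rule and 10% tolerance here are invented for illustration, not the product's actual model — a personal range can be derived from the user's own recent data and mapped straight to the word-based labels the app uses instead of scores:

```python
def personal_range(history, tolerance=0.1):
    """Optimal window = the user's recent baseline +/- a tolerance,
    not a universal target. (Toy rule for illustration.)"""
    baseline = sum(history) / len(history)
    return baseline * (1 - tolerance), baseline * (1 + tolerance)

def status(value, window):
    """Word-based label instead of a score out of 100."""
    low, high = window
    return "Optimized" if low <= value <= high else "Needs attention"

sleep_hours = [7.4, 7.6, 7.5, 7.3, 7.7]   # illustrative recent data
window = personal_range(sleep_hours)
print(status(7.5, window))                 # inside the zone
print(status(6.0, window))                 # outside it
```

A zone tolerates normal day-to-day variation; a single target turns every off day into a failure.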
Each insight invites you deeper. Suggested follow-ups — inspired by how Perplexity handles exploration — let you keep asking about what you're seeing. Why is my core sleep low? What affects my fall-asleep time? The app doesn't just present data. It teaches you your own body, one question at a time.
We designed Norvana to work through the day with you — not as a tool you open, but as a companion that reflects your day as it happens.
Morning surfaces your plans and fresh data. As you move, eat, and check in, the timeline builds. Evening shows what you accomplished and what shifted.
We rejected streaks, badges, and gamification. Health is a long game. Retention comes from value, not dopamine.
Every daily suggestion is generated from your full context — not pulled from a tips database. The same recommendation never appears for two different users.
