[00:00] (0.16s)
We have AI now and everyone thinks
[00:02] (2.24s)
building apps is suddenly easy. But
[00:04] (4.16s)
here's the truth. Most apps built with
[00:06] (6.08s)
AI will never see a single download. And
[00:08] (8.56s)
the few that do, they'll be deleted
[00:10] (10.16s)
within days because they're missing
[00:11] (11.76s)
something crucial. The difference
[00:13] (13.20s)
between apps that fail and apps that
[00:15] (15.20s)
succeed isn't about features or fancy
[00:17] (17.36s)
designs. It's about understanding the
[00:19] (19.04s)
psychology that makes users open your
[00:21] (21.12s)
app 10 times a day without even thinking
[00:23] (23.12s)
about it. That's what I'm going to show
[00:24] (24.64s)
you in this video: the exact framework
[00:26] (26.48s)
for building apps that tap into what
[00:28] (28.40s)
users actually crave and turn casual
[00:30] (30.72s)
users into people who can't imagine
[00:32] (32.72s)
their day without your app. First, you
[00:35] (35.04s)
need to decide what you're going to use
[00:36] (36.56s)
to build your app. There are many AI
[00:38] (38.40s)
builders out there. There's Lovable,
[00:40] (40.08s)
which builds whole apps for you. And
[00:41] (41.84s)
then there are IDEs like Cursor and
[00:43] (43.84s)
Windsurf. You also have AI agents like
[00:46] (46.16s)
Claude Code and the Gemini CLI. In my
[00:48] (48.88s)
opinion, the best coding tool you can
[00:50] (50.80s)
use is Cursor AI. The reason is it gives
[00:53] (53.20s)
you way more control than web-based
[00:55] (55.04s)
builders like Lovable or Bolt.new, while
[00:57] (57.44s)
being much more beginner-friendly than
[00:59] (59.20s)
Claude Code or Gemini CLI. Getting
[01:01] (61.68s)
started is really easy. Installation is
[01:03] (63.92s)
simple and when you open it up, you'll
[01:05] (65.76s)
see this layout with your files on the
[01:07] (67.60s)
right. On this side, you'll see a chat
[01:09] (69.52s)
box and this chat box is what you use to
[01:12] (72.00s)
code with AI. After that, you're going
to hop into Claude or ChatGPT. The free
version is enough. You need to describe
[01:14] (74.24s)
to hop into Claude or ChatgPT. The free
[01:16] (76.72s)
version is enough. You need to describe
[01:18] (78.56s)
your app in plain and simple words. And
[01:20] (80.72s)
then this is the important part. You
[01:22] (82.56s)
need to ask for the user journey because
[01:24] (84.56s)
we're going to build our app based on
[01:26] (86.32s)
that. Once we have the basic user
[01:28] (88.32s)
journey, we'll improve the experience
[01:30] (90.16s)
bit by bit from there. I'm building a
[01:32] (92.32s)
photo cleaner app. I've seen a lot of
[01:34] (94.08s)
apps like this blow up on the iOS app
[01:36] (96.32s)
store. But the main thing I noticed is
[01:38] (98.08s)
that the user experience they offer is
[01:40] (100.16s)
really bad. And this is another
[01:41] (101.60s)
important point. Ideas don't really
[01:43] (103.60s)
matter anymore. What matters more is how
[01:45] (105.84s)
you execute them. Even if you have the
[01:48] (108.00s)
same idea, you can easily convert users
[01:50] (110.48s)
or get them to switch over if you're
[01:52] (112.40s)
giving them a better experience. This
[01:54] (114.24s)
photo cleaner app lets you clean your
[01:56] (116.08s)
photos by swiping left or right. It's a
[01:58] (118.48s)
very simple concept. Now, Claude gave me
[02:00] (120.72s)
the user journey. It's up to you if you
[02:02] (122.72s)
want to fully accept it, but I always
[02:04] (124.64s)
like to read through it and make
[02:06] (126.24s)
changes. For example, it grouped the
[02:08] (128.48s)
photos based on where they came from,
[02:10] (130.32s)
and I told it I wanted them grouped
[02:12] (132.32s)
according to months. So, users can clean
[02:14] (134.56s)
out their photos month by month and
[02:16] (136.56s)
actually feel some sense of achievement.
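That month-by-month grouping is easy to sketch. Here's a minimal JavaScript version, where the photo objects are hypothetical stand-ins for what you'd actually read from the photo library:

```javascript
// Hypothetical photo records; a real app would read these from the
// device's photo library. Only the capture date matters here.
const photos = [
  { id: 1, takenAt: "2024-03-02" },
  { id: 2, takenAt: "2024-03-18" },
  { id: 3, takenAt: "2024-04-05" },
];

// Bucket photos by "YYYY-MM" so each month can be cleaned out on its
// own, giving users a clear finish line per month.
function groupByMonth(items) {
  const groups = {};
  for (const photo of items) {
    const month = photo.takenAt.slice(0, 7); // "YYYY-MM"
    (groups[month] ??= []).push(photo);
  }
  return groups;
}

console.log(Object.keys(groupByMonth(photos))); // → [ '2024-03', '2024-04' ]
```

Finishing a month then becomes a natural milestone for the UI to celebrate.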
[02:18] (138.64s)
After that, it gave me the improved
[02:20] (140.56s)
version. I hopped back into Cursor and
[02:22] (142.72s)
made a new file named ux.md. The reason
[02:25] (145.92s)
for this is because I need to make sure
[02:27] (147.68s)
my user journey is saved and I can
[02:29] (149.76s)
reference it whenever I want to. You
[02:31] (151.60s)
also need to know that Cursor doesn't
[02:33] (153.44s)
remember anything. If I open a new chat,
[02:35] (155.76s)
it won't remember what my app was.
[02:37] (157.60s)
Important bits of information need to be
[02:39] (159.60s)
saved at all times. This is context
[02:41] (161.68s)
engineering, the new version of prompt
[02:43] (163.60s)
engineering. In that file, I pasted the
[02:45] (165.92s)
full user journey that Claude gave me.
[02:48] (168.00s)
If I scroll up, you'll see what I asked
[02:50] (170.08s)
Cursor to build. I told it I needed to
[02:52] (172.24s)
build a mobile app prototype. And again,
[02:54] (174.56s)
this is another important point. You
[02:56] (176.40s)
need to have it built in HTML. Making
[02:58] (178.72s)
changes in HTML, whether it's design or
[03:01] (181.28s)
structure, is way easier compared to
[03:03] (183.36s)
doing the same in iOS apps. This right
[03:05] (185.44s)
here is an iOS app template, and it's
[03:07] (187.68s)
completely empty right now. If I open up
[03:09] (189.84s)
the simulator, you'll see that it's
[03:11] (191.44s)
blank. I'll be filling it up later with
[03:13] (193.28s)
my actual app. But right now, it's in
[03:15] (195.36s)
HTML format. After that, I told it that
[03:17] (197.84s)
the structure and user experience should
[03:20] (200.16s)
be built exactly according to the UX
[03:22] (202.32s)
journey I had given. It got to work and
[03:24] (204.56s)
built the app according to that user
[03:26] (206.40s)
journey. And if I show you the app, this
[03:28] (208.48s)
is what it created. It's really basic
[03:30] (210.56s)
right now, but it allows me to make
[03:32] (212.32s)
changes however I want. For example, if
[03:34] (214.56s)
I want the cards to be bigger or have
[03:36] (216.40s)
one card in each row instead of two, I
[03:38] (218.64s)
can do that easily. When I open this up,
[03:40] (220.48s)
you can see that if I swipe right, the
[03:42] (222.32s)
photo gets saved. If I swipe left, it
[03:44] (224.56s)
gets deleted. It has made some weird
[03:46] (226.64s)
icon choices, but that's not a big
[03:48] (228.32s)
issue. We can easily fix them. You can
[03:50] (230.40s)
see we've got an undo button, and we're
[03:52] (232.32s)
also tracking how many photos we've
[03:54] (234.08s)
changed. Pretty basic app structure, but
[03:56] (236.16s)
this is exactly what we need. We're
[03:57] (237.84s)
going to make a few changes, then
[03:59] (239.44s)
convert it into an iOS app, and move on
[04:01] (241.60s)
from there. Over on the AI Labs Discord
[04:04] (244.16s)
community, we're hosting our first ever
[04:06] (246.00s)
hackathon from July 22nd to July 28th.
[04:09] (249.20s)
Submit your most interesting builds and
[04:11] (251.04s)
projects, and the top five submissions
[04:12] (252.80s)
will be featured in one of our YouTube
[04:14] (254.48s)
videos. You can join by clicking the
[04:16] (256.32s)
link in the pinned comment below. And if
[04:18] (258.24s)
you're enjoying the content so far, make
[04:20] (260.16s)
sure to hit that subscribe button so you
[04:22] (262.24s)
don't miss what's coming next. So,
[04:24] (264.16s)
whatever structure it's made, the first
[04:26] (266.00s)
thing I do is implement a color palette.
[04:28] (268.32s)
For that, I ask it to remove all the
[04:30] (270.24s)
existing colors and emojis from the app.
[04:32] (272.32s)
Once that's done, I tell it to identify
[04:34] (274.40s)
the elements inside the prototype,
[04:36] (276.40s)
things like cards, buttons, and so on.
[04:38] (278.64s)
And then I ask it to map a color palette
[04:40] (280.88s)
to those elements for a consistent look.
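One simple way to express that mapping is CSS custom properties, so every element in the prototype pulls from a single source. A minimal sketch, with made-up element names and hex values rather than the actual palette:

```javascript
// Hypothetical element-to-color mapping, like the one saved in the
// color scheme file; these hex values are placeholders, not the real palette.
const palette = {
  "--bg": "#F4F1EA",     // app background
  "--card": "#FFFFFF",   // photo cards
  "--accent": "#5A8F6B", // primary buttons
  "--danger": "#C94F4F", // delete / swipe-left feedback
};

// Render the mapping as a :root block to drop into the prototype's CSS.
function toCssVariables(map) {
  const lines = Object.entries(map).map(
    ([name, value]) => `  ${name}: ${value};`
  );
  return `:root {\n${lines.join("\n")}\n}`;
}

console.log(toCssVariables(palette));
```

Swapping palettes later then means changing one mapping instead of hunting through every style rule.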
[04:43] (283.04s)
I also tell it to save that in a
[04:45] (285.12s)
color-scheme.md file so it doesn't forget the
[04:47] (287.68s)
mapping. To get the color palette, you
[04:49] (289.60s)
can go to this site called Coolors. I'll
[04:51] (291.68s)
leave it in the description below. You
[04:53] (293.52s)
press spacebar again and again and it
[04:55] (295.84s)
keeps generating different color
[04:57] (297.36s)
palettes. Once you find one you like,
[04:59] (299.52s)
you can export it. Grab the CSS and copy
[05:02] (302.24s)
it. Give that to Cursor and it will
[05:04] (304.16s)
create this color scheme mapping and
[05:05] (305.84s)
save it in the file. Once that was done,
[05:07] (307.60s)
I told it to implement the scheme on the
[05:09] (309.60s)
HTML prototype. Let me show you what the
[05:11] (311.84s)
prototype looks like now. This is what
[05:13] (313.68s)
it's looking like. Much better and
[05:15] (315.20s)
minimal. The styling is a big improvement.
[05:17] (317.28s)
I'm still not fully happy with the
[05:18] (318.88s)
design, but we can change that later on.
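Under the hood, a swipe interaction like this boils down to a distance check when the drag ends. A rough sketch of the decision logic; the 80-pixel threshold is my assumption, not a value from the video:

```javascript
// Minimum horizontal drag distance (in px) before a swipe "counts".
const SWIPE_THRESHOLD = 80;

// deltaX is how far the card moved horizontally during the drag:
// positive means the user dragged right, negative means left.
function resolveSwipe(deltaX) {
  if (deltaX > SWIPE_THRESHOLD) return "keep";    // swipe right: save the photo
  if (deltaX < -SWIPE_THRESHOLD) return "delete"; // swipe left: delete it
  return "none";                                  // too short: snap the card back
}

console.log(resolveSwipe(120)); // "keep"
console.log(resolveSwipe(-95)); // "delete"
console.log(resolveSwipe(30));  // "none"
```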
[05:20] (320.88s)
You can see when I swipe this way, we
[05:22] (322.64s)
get this red color. When I swipe the
[05:24] (324.48s)
other way, we get this blue color. And
[05:26] (326.24s)
it's all implemented according to the
[05:28] (328.16s)
color palette. So overall, it has a
[05:30] (330.40s)
simple coherent look and it's much
[05:32] (332.24s)
better than before. Then another really
[05:34] (334.24s)
important part of crafting a good
[05:35] (335.84s)
experience for your users: the fonts.
[05:37] (337.68s)
Choosing a coherent font is
[05:39] (339.84s)
super important. For this, you can head
[05:41] (341.76s)
over to Google Fonts. They have an
[05:43] (343.52s)
amazing collection of fonts you can
[05:45] (345.20s)
choose from. They've got playful fonts,
[05:47] (347.12s)
modern fonts, every type you could want.
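When you pick a family, Google Fonts hands you an embed snippet to paste into the page's `<head>`. It has roughly this shape, with "Inter" standing in as a placeholder for whichever family you choose:

```html
<!-- Embed code shape from Google Fonts; "Inter" is a placeholder family. -->
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link href="https://fonts.googleapis.com/css2?family=Inter:wght@400;600&display=swap" rel="stylesheet">
<style>
  /* Point the prototype at the new family, with a sensible fallback. */
  body { font-family: "Inter", sans-serif; }
</style>
```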
[05:49] (349.28s)
So you find one that you like. For
[05:51] (351.12s)
example, I wanted to implement this font
[05:53] (353.36s)
right here. To me, it looks really nice,
[05:55] (355.36s)
much better than the default font. So I
[05:57] (357.36s)
click get font, then grab the embed
[05:59] (359.44s)
code. I choose the HTML one. Go back and
[06:02] (362.16s)
paste it in. And if I go back and
[06:04] (364.08s)
refresh the app, you can see that now
[06:05] (365.92s)
it's actually using that font. You might
[06:07] (367.84s)
not notice it at first, but if you look
[06:09] (369.52s)
really close, you can see that the new
[06:11] (371.20s)
font is now being used. Another thing
[06:13] (373.36s)
that's really important for your user
[06:15] (375.28s)
experience is your app icon. You don't
[06:17] (377.60s)
need to go through designers. You can
[06:19] (379.52s)
just use ChatGPT's 4o image model,
[06:22] (382.08s)
which is actually pretty great. Now, I
[06:23] (383.92s)
described that I wanted a cartoonish
[06:26] (386.00s)
icon of a leaf since that's the name of
[06:27] (387.92s)
the app I'm making, and mentioned it's
[06:29] (389.52s)
for an iOS app. It gave me this icon,
[06:31] (391.84s)
but it didn't match the color palette I
[06:33] (393.92s)
was using inside the app. So, I gave it
[06:36] (396.16s)
some additional instructions along with
[06:38] (398.00s)
the color palette, and it created this
[06:40] (400.16s)
icon that actually matches the colors
[06:42] (402.16s)
used in the app. Now, the app was still
[06:44] (404.24s)
in HTML, not a proper app yet. So, we
[06:46] (406.80s)
needed to convert that HTML into a full
[06:49] (409.04s)
Swift app. I came back to cursor and
[06:51] (411.04s)
told it I needed to convert the HTML app
[06:53] (413.28s)
into an iOS swift app. It gave me some
[06:55] (415.44s)
conversion rules inside a file called
[06:57] (417.52s)
conversion.md. Another thing is it
[06:59] (419.84s)
needed to use the Context7 MCP. If you
[07:02] (422.64s)
don't know what MCPs are, they're
[07:04] (424.56s)
essentially a way to give your AI model
[07:06] (426.56s)
some extra context or information.
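For reference, hooking an MCP server into Cursor is just a small JSON config. The commonly documented Context7 entry looks roughly like this at the time of writing (the package name and file location may change, so check the Context7 README), in `.cursor/mcp.json`:

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```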
[07:08] (428.56s)
Context7 MCP is really solid. It has
[07:11] (431.28s)
documentation for a bunch of different
[07:13] (433.12s)
frameworks and languages. It uses vector
[07:15] (435.52s)
search, which means the AI model can
[07:17] (437.60s)
pull in the relevant info it needs much
[07:19] (439.76s)
more efficiently. So, it called the MCP
[07:22] (442.00s)
tool and got all the important
[07:23] (443.68s)
conversion details. If you don't know
[07:25] (445.44s)
how to add the MCP to Cursor, I'll link
[07:27] (447.68s)
a video below. It wrote everything down
[07:29] (449.84s)
in conversion.md and listed out a lot of
[07:32] (452.64s)
points to properly convert the app.
[07:34] (454.72s)
After this, you're going to open a new
[07:36] (456.56s)
chat because when you open a new chat in
[07:38] (458.80s)
Cursor, it forgets what you were talking
[07:40] (460.80s)
about. That's actually a good thing
[07:42] (462.48s)
because it can approach things more
[07:44] (464.16s)
cleanly and often perform better. Once
[07:46] (466.40s)
you've got that new chat open, you tell
[07:48] (468.24s)
it to convert the HTML app into a Swift
[07:50] (470.80s)
app inside the test app we created and
[07:53] (473.12s)
to use the conversion file. After that,
[07:55] (475.12s)
we see another cool cursor feature,
[07:57] (477.12s)
to-do lists. It sets up a to-do list
[07:59] (479.12s)
with 15 items and then one by one, it
[08:01] (481.60s)
goes ahead and edits the files. Once
[08:03] (483.52s)
that's done, it runs the app a few
[08:05] (485.44s)
times. It failed a couple of times, but
[08:07] (487.60s)
that's normal. After running the app and
[08:09] (489.60s)
seeing the errors, it fixed them
[08:11] (491.20s)
automatically. Once that was done, it
[08:13] (493.20s)
informed us that the app has now been
[08:15] (495.04s)
successfully converted and we can go
[08:16] (496.80s)
ahead and open it up. Okay, so you can
[08:18] (498.96s)
clearly see that we have the app running
[08:20] (500.96s)
in our iOS simulator and it's a fully
[08:23] (503.20s)
usable app right now. You can see that
[08:25] (505.12s)
we're going through the pages and I
[08:26] (506.88s)
already gave it the permissions. So now
[08:28] (508.64s)
it has access to the photos. But this is
[08:30] (510.72s)
the basic view that we have. It's still
[08:32] (512.56s)
a little janky right now. We're getting
[08:34] (514.64s)
small errors like views overlapping, but
[08:37] (517.04s)
that's bound to happen and it's
[08:38] (518.48s)
something that'll get fixed gradually.
[08:40] (520.48s)
You just have to focus on each page, or,
[08:42] (522.88s)
as iOS calls it, each view,
[08:45] (525.52s)
one by one. After your iOS app is
[08:48] (528.00s)
functional and running, you need to
[08:49] (529.60s)
focus on the animations. For that,
[08:51] (531.44s)
you're going to copy your ux.md, the one
[08:53] (533.84s)
you got from Claude, and go back to it.
[08:55] (535.68s)
You're going to explain to Claude that
[08:57] (537.36s)
this is the user journey, and now you
[08:59] (539.20s)
need to focus on the animations. Give it
[09:01] (541.20s)
the first section and tell it to
[09:02] (542.88s)
describe how the animations will look
[09:04] (544.72s)
visually. You don't want the
[09:06] (546.00s)
specifications like timing or
[09:07] (547.92s)
directions. It just needs to describe
[09:09] (549.92s)
them visually. Then go back into Cursor
[09:12] (552.16s)
and create an animations guide file. In
[09:14] (554.48s)
this animations guide, you'll place each
[09:16] (556.56s)
section or page as you get them from
[09:18] (558.40s)
Claude. For example, the welcome to leaf
[09:20] (560.72s)
screen goes here. Then you have the
[09:22] (562.40s)
permission request screen and that gets
[09:24] (564.32s)
its own animation description. Once you
[09:26] (566.32s)
list them down, go into Cursor and tell
[09:28] (568.48s)
it you want to implement animations. It
[09:30] (570.64s)
needs to look at the descriptions and
[09:32] (572.24s)
break them down into individual pieces.
[09:34] (574.40s)
Here's the most important part. Cursor
[09:36] (576.24s)
doesn't really know how to implement
[09:37] (577.84s)
animations in SwiftUI or in any other
[09:40] (580.24s)
framework. Even if it tries, the results
[09:42] (582.24s)
will often be janky and not polished.
[09:44] (584.48s)
But with the Context7 MCP, it goes into
[09:47] (587.20s)
the SwiftUI documentation and finds the
[09:49] (589.84s)
proper recommended ways to implement
[09:51] (591.84s)
those animations. Then it writes out the
[09:53] (593.84s)
full implementation inside the
[09:55] (595.60s)
animations guide. The welcome screen
[09:57] (597.36s)
animation has its own section and each
[09:59] (599.36s)
animation has a complete implementation
[10:01] (601.44s)
written down. Now Cursor doesn't have to
[10:03] (603.44s)
think. It just follows that exact
[10:05] (605.28s)
implementation and adds it into the iOS
[10:07] (607.68s)
app. If I go back to the app and restart
[10:09] (609.60s)
it, you'll see we now get this amazing
[10:11] (611.52s)
animation exactly how Claude described
[10:13] (613.76s)
it. Even if I had described all these
[10:15] (615.68s)
small interactions myself, Cursor
[10:17] (617.76s)
wouldn't have gotten it right. Now I
[10:19] (619.44s)
start on the next section. I open a new
[10:21] (621.60s)
tab and give it the same prompt. This
[10:23] (623.84s)
time I ask for the permission request
[10:25] (625.84s)
screen instead of the welcome screen.
[10:27] (627.92s)
You go one by one through the whole app.
[10:30] (630.16s)
You can't implement all the animations
[10:32] (632.08s)
at once. Even if you try, Cursor won't
[10:34] (634.40s)
remember everything and the models will
[10:36] (636.08s)
mess it up. It's definitely recommended
[10:37] (637.92s)
to do it one at a time. When you
[10:39] (639.60s)
encounter errors, capture a screenshot
[10:41] (641.76s)
and send it to Cursor. Make sure to tell
[10:43] (643.76s)
it to use the Context7 MCP again. That
[10:46] (646.80s)
way, it becomes contextually aware of
[10:49] (649.04s)
how to fix the errors. It performs much
[10:51] (651.36s)
better with the Context7 MCP because it
[10:53] (653.92s)
pulls the exact implementation and code
[10:56] (656.08s)
needed to fix the problem. And to show
[10:58] (658.24s)
you the final result, this is what it
[11:00] (660.16s)
looks like after editing everything.
[11:01] (661.92s)
It's looking pretty good. If I go back
[11:03] (663.68s)
to the homepage, you'll see the app icon
[11:05] (665.68s)
we implemented is also right here.
[11:07] (667.52s)
Opening it back up and going through the
[11:09] (669.28s)
onboarding process, you'll notice some
[11:11] (671.20s)
jitters that I'll need to fix. This
[11:13] (673.12s)
isn't a single day process. It's
[11:14] (674.96s)
actually quite lengthy to fix everything
[11:16] (676.80s)
properly. Even with AI: what used to
[11:19] (679.12s)
take about six months now takes about
[11:21] (681.12s)
a month to make a fully polished product.
[11:23] (683.28s)
Moving forward, you'll see the
[11:24] (684.64s)
animations look really nice and
[11:26] (686.24s)
polished, especially on this screen. It
[11:28] (688.24s)
nailed this one. Next is the gallery
[11:30] (690.16s)
page. I haven't made any edits to it
[11:32] (692.16s)
yet, so I'll come back to that later.
[11:33] (693.92s)
Then we have the main picture swipe
[11:35] (695.60s)
screen. I got it into good condition,
[11:37] (697.44s)
but there's still a lot to implement.
[11:39] (699.12s)
The rest works well, and if you skip
[11:40] (700.88s)
ahead, it gives us a summary of what we
[11:43] (703.04s)
achieved. For example, when I delete
[11:44] (704.96s)
this photo and skip, it confirms that
[11:47] (707.04s)
one photo was deleted. You can see there
[11:48] (708.96s)
are still some UI issues, but it takes
[11:51] (711.04s)
time. What you need to do is use this
[11:52] (712.80s)
framework and fix everything one by one.
[11:55] (715.04s)
That's how you get really well-made,
[11:56] (716.72s)
polished apps and gain a competitive
[11:58] (718.56s)
edge over your competitors. That brings
[12:01] (721.04s)
us to the end of this video. If you'd
[12:02] (722.80s)
like to support the channel and help us
[12:04] (724.56s)
keep making videos like this, you can do
[12:06] (726.48s)
so by using the super thanks button
[12:08] (728.32s)
below. As always, thank you for watching
[12:10] (730.24s)
and I'll see you in the next one.