[00:00] (0.16s)
The UI design industry as we know it is
[00:02] (2.48s)
dead. And if you're still hiring or
[00:04] (4.32s)
planning to become a traditional UI
[00:06] (6.24s)
designer in 2025, you're already falling
[00:08] (8.80s)
behind. The workflows have been
[00:10] (10.40s)
completely transformed due to
[00:12] (12.08s)
revolutionary new models like Claude 4
[00:14] (14.40s)
Sonnet and Opus along with the
[00:16] (16.16s)
incredible tools built on top of them.
[00:18] (18.16s)
Claude Code, Cursor, and other agentic
[00:20] (20.64s)
IDEs. Here's what most people don't
[00:22] (22.40s)
realize. The majority of designers and
[00:24] (24.40s)
developers aren't using these new
[00:26] (26.16s)
workflows that I'm about to show you
[00:27] (27.84s)
today. And that's exactly what gives you
[00:29] (29.68s)
the competitive edge. These workflows
[00:31] (31.76s)
enable you to create amazing UI
[00:34] (34.00s)
prototypes and transform them into
[00:35] (35.84s)
completely functional applications that
[00:37] (37.76s)
you can actually sell and build entire
[00:39] (39.76s)
businesses around. Claude 4 has been
[00:41] (41.76s)
available for some time now, and what
[00:43] (43.52s)
I've enjoyed most about it is its
[00:45] (45.52s)
ability to quickly and accurately
[00:47] (47.44s)
generate exactly the designs I want.
[00:49] (49.76s)
Whether you're using Claude Code or any
[00:51] (51.68s)
other agentic IDE, Claude 4 never stops
[00:54] (54.40s)
surprising me with its capabilities. Let
[00:56] (56.40s)
me show you a perfect example. Apple has
[00:58] (58.80s)
just introduced their new software
[01:00] (60.48s)
design with the latest version of iOS
[01:02] (62.72s)
and it's absolutely beautiful and
[01:04] (64.56s)
amazing for mobile apps, especially iOS
[01:07] (67.20s)
apps. It looks incredible. All the
[01:09] (69.12s)
examples I'm seeing here actually made
[01:11] (71.04s)
me think, why not build an app and try
[01:12] (72.96s)
to implement this myself? As you know,
[01:15] (75.04s)
you can create iOS apps yourself. You
[01:17] (77.04s)
just need to get them approved and then
[01:18] (78.56s)
you can actually sell them. So, that's
[01:20] (80.16s)
exactly what I did. Right here, you can
[01:22] (82.00s)
see the app I built. It's fully
[01:23] (83.68s)
functional. The animations have been
[01:25] (85.36s)
implemented directly and it features
[01:27] (87.20s)
these beautiful iOS-style transitions
[01:29] (89.60s)
which look absolutely stunning and
[01:31] (91.44s)
impressive. You can see the animations
[01:33] (93.04s)
right here. And the amazing part is I
[01:35] (95.12s)
can go ahead and upload it to the App
[01:36] (96.72s)
Store right away. Now, in this video, I'm
[01:38] (98.80s)
going to show you how powerful Claude 4
[01:41] (101.04s)
is in UI workflows and how you can
[01:43] (103.20s)
implement any type of app whether it's
[01:45] (105.20s)
an iOS app or a web app using this
[01:47] (107.44s)
workflow to build anything you want. Now
[01:49] (109.76s)
the question arises, what should we use
[01:51] (111.92s)
to build these amazing apps on this
[01:54] (114.08s)
amazing device? And the answer is you
[01:56] (116.08s)
can use pretty much anything you want.
[01:57] (117.76s)
If you want to use Cursor, you can use
[01:59] (119.44s)
Cursor. If you want to use Claude Code,
[02:01] (121.44s)
then you can use Claude Code as well. It's
[02:03] (123.44s)
honestly up to you and what you prefer.
[02:05] (125.20s)
There are a few differences between
[02:06] (126.56s)
them. Claude Code has a larger context
[02:08] (128.72s)
limit and uses API credits directly, but
[02:11] (131.28s)
Cursor has solved that with their new
[02:13] (133.12s)
Max Mode, which allows you to use models
[02:15] (135.44s)
with full context when enabled at their
[02:17] (137.44s)
API pricing. So you get the full context
[02:19] (139.68s)
limit without being restricted by
[02:21] (141.28s)
Cursor's previous context limitations.
[02:23] (143.28s)
After that, it honestly comes down to
[02:25] (145.20s)
your own preference. Some people like a
[02:27] (147.28s)
certain type of UI because it's visually
[02:29] (149.44s)
clearer, the files are easy to navigate,
[02:31] (151.76s)
and it's more comfortable for them to
[02:33] (153.44s)
code in that environment. Others, like
[02:35] (155.60s)
me, prefer to have their coding sessions
[02:37] (157.68s)
directly in the terminal, which is why I
[02:39] (159.52s)
lean toward Claude Code. Other than that,
[02:41] (161.60s)
there aren't many major differences.
[02:43] (163.36s)
Claude Code also now supports adding
[02:45] (165.44s)
images, so that's been addressed, too.
[02:47] (167.36s)
Not many distinctions remain at this
[02:49] (169.20s)
point. Now let's take this diet app as
[02:51] (171.52s)
an example. You can see that it has a
[02:53] (173.52s)
beautiful UI. So if you wanted to clone
[02:56] (176.08s)
this UI and turn it into a fully
[02:58] (178.00s)
functional app, how would you do that?
[02:59] (179.84s)
Well, let me show you how. You can
[03:01] (181.68s)
clearly see that I gave it the image.
[03:03] (183.44s)
And yes, this is another amazing
[03:05] (185.04s)
feature. You can now just drop images
[03:07] (187.12s)
directly into Claude Code and it can use
[03:09] (189.28s)
them. So I dropped the image and asked
[03:11] (191.36s)
it to look at it and clone all the pages
[03:13] (193.60s)
into an HTML file. I also specified that
[03:16] (196.48s)
I wanted the pages displayed separately,
[03:18] (198.72s)
not as a whole prototype app, and it
[03:20] (200.80s)
just started working on it. Another
[03:22] (202.40s)
amazing thing about Claude Code is that
[03:24] (204.32s)
it creates its own to-do list and
[03:26] (206.40s)
executes tasks one by one. This is
[03:28] (208.72s)
something I really like about Claude
[03:30] (210.32s)
Code. It doesn't require a separate task
[03:32] (212.48s)
manager, unlike Cursor, where you often
[03:34] (214.64s)
need external tools. In fact, in our own
[03:37] (217.12s)
videos, we've shown how people try to
[03:39] (219.04s)
make Cursor follow a task list and then
[03:41] (221.20s)
build something structured from that.
[03:42] (222.80s)
You can see here it just started
[03:44] (224.64s)
creating and crossed off one task after
[03:46] (226.80s)
another. Eventually it generated a file
[03:49] (229.12s)
called fitnessapp.html
[03:51] (231.36s)
with all the screens in one file. Now
[03:53] (233.44s)
let me actually show you how it cloned
[03:55] (235.20s)
the UI which I think looks amazing. This
[03:57] (237.28s)
is the UI it ended up generating and I
[03:59] (239.60s)
think it looks great. This is just a
[04:01] (241.20s)
prototype built purely using HTML and
[04:04] (244.00s)
CSS. It's not functional yet. To make it
[04:06] (246.56s)
functional, I actually went ahead and
[04:08] (248.32s)
used Cursor. As you know, this is on a
[04:10] (250.24s)
paid plan and I have to pay per API
[04:12] (252.40s)
usage. So for straightforward tasks like
[04:14] (254.88s)
implementing already planned structures,
[04:16] (256.96s)
I sometimes just use Cursor. If the
[04:19] (259.04s)
structure is already made, it executes
[04:20] (260.96s)
it efficiently. I don't have to worry
[04:22] (262.72s)
about flawed code due to its smaller
[04:24] (264.80s)
context limit. It builds a solid
[04:26] (266.64s)
structure without needing constant
[04:28] (268.40s)
additional context. Now, here in Cursor,
[04:31] (271.12s)
you can clearly see that I opened the
[04:33] (273.04s)
app and since my prototype was already
[04:35] (275.12s)
available, I simply asked it to convert
[04:37] (277.44s)
the prototype into a Next.js app. I also
[04:40] (280.08s)
asked it to make sure the intended
[04:41] (281.92s)
functionality was preserved which at
[04:43] (283.84s)
this point was primarily just the
[04:45] (285.60s)
navigation. Most of the other
[04:47] (287.12s)
functionality hadn't been implemented
[04:48] (288.88s)
yet. So that's what we focused on. As
[04:51] (291.12s)
you can see it accurately cloned the UI
[04:53] (293.60s)
and it looks really good. The navigation
[04:55] (295.52s)
is fully functional. We have our pages
[04:57] (297.68s)
in place. It even added a bit of
[04:59] (299.44s)
animation as you can see right here. And
[05:01] (301.60s)
overall, everything seems to be working
[05:03] (303.52s)
well. Now, that was just a simple web
[05:05] (305.52s)
app and I converted it into a Next.js
[05:07] (307.68s)
app. But what if you want to build
[05:09] (309.12s)
something functional like the iOS app I
[05:11] (311.36s)
showed you at the beginning? Here's the
[05:12] (312.96s)
process you're going to follow. The
[05:14] (314.64s)
thing is, you don't need to learn any
[05:16] (316.56s)
specific framework anymore. You can just
[05:18] (318.56s)
go ahead and build these apps, but it's
[05:20] (320.56s)
important to design the UI first so you
[05:23] (323.04s)
don't run into issues after everything
[05:25] (325.04s)
else has been set up. So I just
[05:26] (326.88s)
initialized Claude Code in my terminal. After
[05:28] (328.88s)
it was initialized, I told it that I
[05:30] (330.88s)
wanted to build a prototype of a recipes
[05:33] (333.04s)
app and I sent it the details. I also
[05:35] (335.28s)
mentioned that once the prototype was
[05:37] (337.04s)
ready, I would like to implement it in
[05:38] (338.88s)
Swift. Now you'll see there's this plan
[05:40] (340.72s)
mode, and you can use Shift+Tab to
[05:42] (342.96s)
cycle through the different modes. For
[05:44] (344.56s)
example, there's the auto-accept edits
[05:46] (346.64s)
mode which lets Claude autonomously work
[05:48] (348.72s)
in your repo, just like how Cursor
[05:50] (350.64s)
handles tasks for you. Then there's plan
[05:52] (352.72s)
mode where you simply discuss your ideas
[05:54] (354.80s)
and how you're going to implement them.
[05:56] (356.48s)
You can use either this or ChatGPT. I
[05:58] (358.88s)
ended up using ChatGPT to save tokens
[06:01] (361.44s)
and I think that's a good best practice.
[06:03] (363.28s)
Just chat in ChatGPT, figure stuff out
[06:05] (365.60s)
there and then come back and give your
[06:07] (367.28s)
context here. The context matters and
[06:09] (369.44s)
doing this saves you tokens while
[06:11] (371.04s)
achieving the same effect. Now, let me
[06:12] (372.64s)
show you what prompt I used to build the
[06:14] (374.56s)
prototype and then how I converted that
[06:16] (376.56s)
prototype into a SwiftUI app. And
[06:18] (378.48s)
finally, how I transformed that SwiftUI
[06:20] (380.80s)
app into an incredibly polished liquid
[06:22] (382.96s)
UI version of the same app. If you like
[06:25] (385.44s)
the content we're making, I'd really
[06:27] (387.12s)
appreciate it if you could subscribe.
[06:28] (388.88s)
Right now, we're also testing out
[06:30] (390.64s)
memberships to support the channel.
[06:32] (392.32s)
We've only launched the first tier, and
[06:34] (394.16s)
for now, it offers priority comment
[06:36] (396.32s)
replies, but subscribing helps us see
[06:38] (398.40s)
how many of you are interested and want
[06:40] (400.16s)
to support what we're doing. First of
[06:42] (402.00s)
all, I just went to ChatGPT and
[06:44] (404.00s)
discussed how I was going to convert the
[06:45] (405.84s)
HTML into an iOS app. Please ignore any
[06:48] (408.72s)
errors you see. I wasn't using an AI
[06:50] (410.88s)
transcriber. It was the native macOS
[06:53] (413.20s)
transcription. But this prompt shows you
[06:55] (415.04s)
that you need to discuss whatever you're
[06:56] (416.80s)
trying to do with an LLM. And especially
[06:58] (418.80s)
when working with less common languages and
[07:00] (420.64s)
frameworks like Swift for iOS, it's important to
[07:02] (422.80s)
ask it to perform a web search. Then
[07:04] (424.80s)
it'll give you a comprehensive solution
[07:06] (426.72s)
like in my case, how to convert HTML to
[07:09] (429.20s)
SwiftUI. Once it explained the process,
[07:11] (431.36s)
I knew what I needed to do. After that,
[07:13] (433.44s)
I created a prompt for the basic app I
[07:15] (435.68s)
wanted to build. I shared the full
[07:17] (437.36s)
functionality and the features the app
[07:19] (439.36s)
would include. Based on that, it gave me
[07:21] (441.36s)
a solid project overview for an app
[07:23] (443.52s)
named Kitchen Recipes, which I later
[07:25] (445.52s)
renamed. It provided the entire UI
[07:27] (447.92s)
structure and SwiftUI view code, which
[07:30] (450.16s)
helped map out how the app's code should be
[07:32] (452.48s)
structured, especially since it needed
[07:34] (454.40s)
to be implemented in SwiftUI. So, the
[07:36] (456.80s)
process is simple. Go ahead and discuss
[07:38] (458.88s)
your idea, especially how you plan to
[07:41] (461.04s)
convert your basic prototype into a full
[07:43] (463.28s)
app. For example, if you want to convert
[07:45] (465.28s)
HTML into a React app, talk it through
[07:47] (467.60s)
and see how feasible it is. If there's a
[07:49] (469.68s)
straightforward approach, go ahead and
[07:51] (471.44s)
plan your app. Once I had all that
[07:53] (473.20s)
figured out, I gave the prompt to Claude
[07:55] (475.28s)
Code, and this is what it came up with.
[07:57] (477.12s)
It's a very basic version, but a really
[07:59] (479.28s)
well-done prototype. You can see
[08:01] (481.12s)
everything here is functional and works
[08:03] (483.04s)
exactly as intended. The categories
[08:04] (484.96s)
work. We can add a new recipe. All the
[08:07] (487.12s)
fields function properly and it even
[08:09] (489.12s)
added a favorites panel in the settings
[08:11] (491.20s)
section on its own. Overall, it's a
[08:13] (493.36s)
really nice and responsive design, an
[08:15] (495.44s)
excellent prototype. And now that I'm
[08:17] (497.20s)
satisfied and confident that I want to
[08:19] (499.12s)
build my full app on top of this, I can
[08:21] (501.36s)
move forward. Now, the next step is
[08:23] (503.28s)
actually converting this HTML prototype
[08:25] (505.84s)
into a fully functional Swift app. For
[08:27] (507.92s)
that, I had to open up Xcode and
[08:29] (509.84s)
initialize a new project. You can see
[08:31] (511.76s)
right here that this project called
[08:33] (513.68s)
Kitchen Delights was created in Xcode.
[08:36] (516.00s)
These are template projects. You'll
[08:37] (517.68s)
notice in Xcode you have predefined
[08:39] (519.84s)
templates for iOS and macOS as well.
[08:42] (522.48s)
This project is one of those templates.
[08:44] (524.32s)
I also cloned a Mac app using the same
[08:46] (526.64s)
process. So, if you're interested, feel
[08:48] (528.64s)
free to check that video out. And if
[08:50] (530.40s)
you'd like a tutorial on how Xcode
[08:52] (532.48s)
functions in general, you can look that
[08:54] (534.24s)
up, too. After setting up the template,
[08:56] (536.16s)
I needed a file to guide Claude Code on
[08:58] (538.56s)
how the conversion should be handled.
[09:00] (540.32s)
For that, I created a file named
[09:02] (542.24s)
convert.md. I went back into ChatGPT
[09:05] (545.12s)
and told it that the prototype was ready
[09:07] (547.04s)
and now I needed to convert it. I
[09:08] (548.96s)
instructed it to search the web and get
[09:10] (550.88s)
the best practices for this kind of
[09:12] (552.72s)
conversion so that when Claude Code
[09:14] (554.64s)
started working, it would know exactly
[09:16] (556.64s)
how to translate one format into the
[09:18] (558.72s)
other. It gave me a pretty comprehensive
[09:20] (560.80s)
set of rules which I then placed into
[09:22] (562.88s)
the convert.md file. You can see the
[09:25] (565.44s)
file right here. This is what directed
[09:27] (567.44s)
Claude Code during the conversion
[09:29] (569.28s)
process, ensuring there were no mistakes.
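To make that concrete, here is a minimal sketch of the kind of mapping such conversion rules describe, going from a prototype card in HTML/CSS to a SwiftUI view. The model, property names, and styling below are illustrative assumptions, not the code the tools actually generated.

```swift
import SwiftUI

// Hypothetical model, standing in for whatever the generated app defines.
struct Recipe: Identifiable {
    let id = UUID()
    let title: String
    let category: String
    let minutes: Int
}

// A prototype card that was a <div> with a heading, a category badge, and a
// cook-time label becomes a small, reusable SwiftUI view.
struct RecipeCard: View {
    let recipe: Recipe

    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            Text(recipe.title)
                .font(.headline)                      // roughly an <h3>
            Text(recipe.category)
                .font(.caption)
                .padding(.horizontal, 8)
                .padding(.vertical, 4)
                .background(Color.accentColor.opacity(0.15), in: Capsule())
            Label("\(recipe.minutes) min", systemImage: "clock")
                .font(.subheadline)
                .foregroundStyle(.secondary)
        }
        .padding()
        // CSS border-radius and card backgrounds map onto a shaped background.
        .background(.quaternary, in: RoundedRectangle(cornerRadius: 16))
    }
}
```

Rules along the lines of "a flex column becomes a VStack" or "border-radius becomes a rounded shaped background" are the sort of thing such a guide spells out.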
[09:31] (571.60s)
And now if I close this and show you the
[09:34] (574.08s)
result, you can see I have a fully
[09:35] (575.76s)
functional app. It looks exactly the
[09:37] (577.36s)
same as the prototype and everything
[09:39] (579.12s)
works. The navigation is functional.
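For context, the navigation in a SwiftUI app of this shape usually boils down to a tab bar with a navigation stack per tab, something like the sketch below; the tab names follow the features shown in the video, and the real generated structure may differ.

```swift
import SwiftUI

// Rough shape of the app shell: a tab bar plus per-tab navigation stacks.
// Placeholder text views stand in for the real screens.
struct AppShell: View {
    var body: some View {
        TabView {
            NavigationStack { Text("Recipes list") }
                .tabItem { Label("Recipes", systemImage: "book") }
            NavigationStack { Text("Favorites") }
                .tabItem { Label("Favorites", systemImage: "heart") }
            NavigationStack { Text("Settings") }
                .tabItem { Label("Settings", systemImage: "gear") }
        }
        // Dark mode needs no extra work as long as the views stick to semantic
        // colors and materials; SwiftUI adapts them automatically in the
        // simulator and on a real device.
    }
}
```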
[09:41] (581.04s)
Dark mode works and I'm currently
[09:42] (582.72s)
running this in an iPhone simulator. So
[09:44] (584.96s)
this is exactly how it would appear and
[09:46] (586.96s)
behave on a real device. So it's a solid
[09:49] (589.36s)
app and it looks great right now. This
[09:51] (591.20s)
is a fully working app that I can submit
[09:53] (593.28s)
to the App Store as soon as it passes
[09:55] (595.04s)
review. But it's still missing the new
[09:56] (596.80s)
UI that Apple has introduced and I'm not
[09:59] (599.12s)
going to do anything manually to add it.
[10:01] (601.04s)
So, how am I going to implement this
[10:02] (602.88s)
completely new design using Claude Code?
[10:05] (605.36s)
In order to implement the new UI, I
[10:07] (607.60s)
needed to provide a visual reference as
[10:09] (609.68s)
well as a clear structure of how it was
[10:11] (611.68s)
supposed to look and behave, especially
[10:13] (613.60s)
how it was supposed to be animated. I
[10:15] (615.68s)
selected two images from Apple's
[10:17] (617.52s)
official website. One showing the side
[10:19] (619.52s)
panel, which displayed most of the card
[10:21] (621.44s)
types and toggle buttons, and another
[10:23] (623.28s)
showing the UI navigation bar that used
[10:25] (625.60s)
the liquid glass texture. These visuals
[10:27] (627.76s)
represented the new Apple Liquid Glass
[10:29] (629.92s)
design. I then went back to ChatGPT,
[10:32] (632.16s)
uploaded those images and explained that
[10:34] (634.24s)
this was the new Apple Liquid Glass
[10:36] (636.16s)
design. I asked it to give me a general
[10:38] (638.16s)
overview of the design language. It was
[10:40] (640.24s)
allowed to search the web and analyze
[10:42] (642.24s)
the images to produce an accurate
[10:44] (644.00s)
description. Once it gave me that, I
[10:45] (645.92s)
requested a full design language report,
[10:48] (648.48s)
one that explained not just how it
[10:50] (650.40s)
looked, but how it was animated. Because
[10:52] (652.40s)
this new liquid UI is heavily
[10:54] (654.24s)
animation-based, and I wanted to replicate
[10:56] (656.48s)
that behavior as part of the final
[10:58] (658.56s)
implementation. You can see that the
[11:00] (660.72s)
exact prompt about how it looks and how
[11:02] (662.88s)
it animates is right here. I gave it to
[11:04] (664.96s)
Claude Code along with the two reference
[11:06] (666.96s)
images and told it that this was the
[11:08] (668.96s)
visual reference. It immediately
[11:10] (670.64s)
recognized that I wanted to implement
[11:12] (672.48s)
the Apple Liquid Glass design and it
[11:14] (674.56s)
started working on it. Again, it created
[11:16] (676.80s)
its to-do list and began checking off
[11:18] (678.88s)
tasks one by one. I must say it did take
[11:21] (681.76s)
a little bit of time not just to build
[11:23] (683.68s)
everything out but to actually test it
[11:25] (685.60s)
as well. And once it was done there were
[11:27] (687.52s)
still some elements that hadn't been
[11:29] (689.20s)
fully addressed. So I asked it to make
[11:31] (691.12s)
sure the design was applied across all
[11:33] (693.12s)
components. There were also a few bugs.
[11:35] (695.36s)
So I provided images highlighting the
[11:37] (697.36s)
issues, which really helped speed up the
[11:39] (699.44s)
debugging process. I asked it to fix all
[11:41] (701.76s)
the inconsistencies and polish
[11:43] (703.52s)
everything. And the result is the
[11:45] (705.04s)
amazing-looking app you see right here.
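The exact generated code isn't shown in the video, but the glassy card styling can be approximated with SwiftUI materials that have shipped since iOS 15, as in the sketch below; the newest SDKs also expose a dedicated glass-effect API for the real Liquid Glass look, which is worth verifying against Apple's current documentation. The names and values here are illustrative.

```swift
import SwiftUI

// An approximation of the glass-card look using standard materials; the real
// Liquid Glass treatment in the newest SDKs uses its own dedicated modifier.
struct GlassCard<Content: View>: View {
    private let content: Content

    init(@ViewBuilder content: () -> Content) {
        self.content = content()
    }

    var body: some View {
        content
            .padding()
            .background(.ultraThinMaterial, in: RoundedRectangle(cornerRadius: 24))
            .overlay(
                RoundedRectangle(cornerRadius: 24)
                    .strokeBorder(.white.opacity(0.25), lineWidth: 1)
            )
            .shadow(color: .black.opacity(0.15), radius: 12, y: 6)
    }
}

// The animation is a big part of the effect; a soft spring on state changes
// gets reasonably close to the bouncy feel described in the design report.
struct GlassCardDemo: View {
    @State private var showDetails = false

    var body: some View {
        GlassCard {
            VStack(spacing: 4) {
                Text("Pasta Primavera").font(.headline)
                if showDetails {
                    Text("25 min · Vegetarian").foregroundStyle(.secondary)
                }
            }
        }
        .onTapGesture {
            withAnimation(.spring(response: 0.4, dampingFraction: 0.7)) {
                showDetails.toggle()
            }
        }
    }
}
```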
[11:46] (706.96s)
You can see it's the same app, but now
[11:48] (708.56s)
it just looks so much better. Right now,
[11:50] (710.48s)
it's running in a simulated environment,
[11:52] (712.64s)
not an actual iOS device, so it's a bit
[11:54] (714.96s)
laggy, but otherwise it looks
[11:56] (716.48s)
incredible. You can see the categories
[11:58] (718.24s)
right here. If I want to add a recipe, I
[12:00] (720.48s)
can just go ahead and do that. The
[12:02] (722.00s)
settings panel works, and dark mode is
[12:04] (724.24s)
also supported. That beautiful glass
[12:06] (726.16s)
effect looks especially cool in dark
[12:08] (728.16s)
mode. The favorites feature works, too.
[12:10] (730.32s)
So, overall, the app looks and feels so
[12:12] (732.72s)
much better now, and it's fully
[12:14] (734.24s)
functional. Anyone can use it to track
[12:16] (736.24s)
their recipes. And if I wanted to, I
[12:18] (738.40s)
could go ahead and submit it to the
[12:19] (739.92s)
Apple App Store and actually sell it.
[12:21] (741.84s)
That's the real beauty of this. Anyone
[12:23] (743.76s)
can go ahead and start building their
[12:25] (745.44s)
own apps. And these aren't just small
[12:27] (747.20s)
demo apps. They're fully functional,
[12:29] (749.20s)
beautifully designed apps. And you can
[12:31] (751.04s)
build them using any framework you want
[12:33] (753.12s)
following the exact workflow I just
[12:35] (755.12s)
explained. Just remember, always provide
[12:37] (757.36s)
full context when working with these
[12:39] (759.12s)
LLMs, especially for conversions or
[12:41] (761.44s)
tools they might struggle with.
[12:42] (762.80s)
Honestly, they still struggle a bit with
[12:44] (764.96s)
anything beyond Python, JavaScript, and
[12:47] (767.44s)
their related frameworks. So, for other
[12:49] (769.60s)
languages, be sure to guide them with
[12:51] (771.52s)
proper context. And that's it. They do
[12:53] (773.44s)
amazing work for you. You can even build
[12:55] (775.28s)
entire businesses on top of this. That
[12:57] (777.60s)
brings us to the end of this video. If
[12:59] (779.44s)
you'd like to support the channel and
[13:01] (781.04s)
help us keep making tutorials like this,
[13:03] (783.20s)
you can do so by using the super thanks
[13:05] (785.20s)
button below. As always, thank you for
[13:07] (787.12s)
watching, and I'll see you in the next one.