[00:00] (0.16s)
Cursor has a limit on how big your
[00:02] (2.16s)
project can be because of the model's
[00:03] (3.92s)
context size. Every day, people struggle
[00:06] (6.24s)
with things not working or Cursor
[00:08] (8.32s)
messing up their projects. I found
[00:10] (10.16s)
something better. It's called Plandex, a
[00:12] (12.48s)
command-line coding agent built for
[00:14] (14.48s)
large-scale projects. In this video,
[00:16] (16.48s)
I'll show you how to install it, how it
[00:18] (18.56s)
works, and what you can do with it.
[00:20] (20.40s)
Let's get started. The special thing
[00:22] (22.00s)
about Plandex is that it can handle up to
[00:24] (24.16s)
2 million tokens of context directly,
[00:26] (26.48s)
which is a lot. It can also index
[00:28] (28.32s)
directories with up to 20 million tokens
[00:30] (30.48s)
or more. This is possible because it
[00:32] (32.40s)
uses tree-sitter project maps, built on
[00:34] (34.72s)
the tree-sitter parsing library that many
[00:37] (37.28s)
code editors use to navigate code. Not
[00:39] (39.68s)
only that, it uses multiple models
[00:42] (42.00s)
through the OpenRouter API and
[00:44] (44.40s)
automatically picks the best one at any
[00:46] (46.48s)
given time. This is why they say it's
[00:48] (48.40s)
designed to be resilient for large code
[00:50] (50.24s)
bases. Let's go ahead and see how to
[00:52] (52.24s)
install it. Now, let's talk about the
[00:53] (53.76s)
installation options. If you're on
[00:55] (55.52s)
Windows, you need WSL or it won't work.
[00:58] (58.64s)
There are three ways to run it. First,
[01:00] (60.80s)
you can use Plandex Cloud, where you don't
[01:02] (62.96s)
need separate API keys. Everything runs
[01:05] (65.20s)
in the cloud and you can get started
[01:06] (66.72s)
quickly. The quick start guides are in
[01:08] (68.72s)
the GitHub repo and I'll link it below.
[01:11] (71.04s)
Second, you can use Plandex Cloud with
[01:12] (72.96s)
your own API keys. You bring your own
[01:15] (75.12s)
keys but still use the cloud service.
[01:16] (76.96s)
Third is the self-hosted local mode
[01:19] (79.20s)
where you run Plandex yourself with
[01:20] (80.88s)
Docker and use your own API keys. In
[01:23] (83.28s)
this demo, we'll be working with the
[01:24] (84.72s)
local mode. Let's set it up locally.
[01:26] (86.96s)
This is the local mode quick start
[01:28] (88.56s)
guide. It's linked in the GitHub repo
[01:30] (90.88s)
and I'll also put it in the description
[01:32] (92.64s)
below. First, you need to clone the
[01:34] (94.64s)
GitHub repo and start the server.
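For reference, the local-mode quickstart boils down to a few shell commands along these lines (a rough sketch of the repo's guide; check the linked quickstart for the exact, current steps):

  # clone the Plandex repo and start the local server with Docker
  $ git clone https://github.com/plandex-ai/plandex.git
  $ cd plandex/app
  $ ./start_local.sh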
[01:37] (97.36s)
As you can see, I have pasted in the command
[01:39] (99.76s)
and now my server is running locally for
[01:42] (102.00s)
Plandex. Before you paste this command,
[01:44] (104.56s)
make sure Docker is installed, set up,
[01:46] (106.72s)
and running. Otherwise, it will throw an
[01:49] (109.12s)
error. The next command goes into a new
[01:50] (110.96s)
terminal to install the Plandex CLI.
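As documented in the repo at the time of writing, the CLI install is a one-liner along these lines (double-check the quickstart in case it has changed):

  # install the Plandex CLI; it may ask for your sudo password
  $ curl -sL https://plandex.ai/install.sh | bash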
[01:53] (113.68s)
You can see I entered the command and now
[01:55] (115.60s)
the Plandex CLI is installed. It will attempt
[01:57] (117.92s)
to use sudo during installation, so
[02:00] (120.24s)
you'll need to enter your password as
[02:01] (121.84s)
well. After that, in the same terminal,
[02:04] (124.16s)
you'll sign into Plandex. It will create
[02:06] (126.32s)
a user for you because you're running it
[02:08] (128.08s)
locally. Just copy and paste the
[02:10] (130.08s)
command.
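The sign-in step is just the CLI's sign-in subcommand, roughly:

  # sign in against the locally running server
  $ plandex sign-in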
[02:12] (132.08s)
After pasting, it will ask how you're using Plandex. Remember the three
[02:14] (134.00s)
options I mentioned? We're selecting
[02:15] (135.84s)
local mode. You'll see a host address.
[02:18] (138.08s)
If you look back, the server command is
[02:20] (140.08s)
running Plandex on a specific port, which
[02:22] (142.64s)
shows up in the default option. Just
[02:24] (144.56s)
press enter. It will create a user and sign
[02:26] (146.96s)
you in. And now, to start Plandex, you
[02:29] (149.28s)
just use the plandex command. It will
[02:31] (151.12s)
spin up a REPL for you in the project
[02:33] (153.04s)
directory you want to work in. Now you
[02:34] (154.72s)
can see I'm in my desired directory. And
[02:36] (156.80s)
I want to initialize Plandex. Before you
[02:39] (159.20s)
do that, you need to expose your
[02:41] (161.28s)
OpenRouter API key and your OpenAI API key.
[02:44] (164.64s)
That's what Plandex will use.
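Exposing the keys just means setting them as environment variables in the terminal where you'll run Plandex, for example:

  # replace the placeholders with your actual keys
  $ export OPENROUTER_API_KEY=your-openrouter-key
  $ export OPENAI_API_KEY=your-openai-key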
[02:46] (166.72s)
After you have your API keys exposed, you can
[02:48] (168.96s)
initialize Plandex in any repo you want
[02:51] (171.60s)
using this command or a shortened
[02:53] (173.60s)
version as well. Both work the same way
[02:55] (175.60s)
and will initialize Plandex.
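In practice that just means running the CLI from your project directory; pdx is, as I understand it, the short alias the installer sets up:

  $ cd /path/to/your/project   # placeholder path, use your own repo
  $ plandex                    # starts the Plandex REPL here
  $ pdx                        # shortened alias, same thing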
[02:58] (178.08s)
For more commands, you can check the instructions
[03:00] (180.16s)
here. There are also other commands you
[03:02] (182.16s)
can use like setting the configuration
[03:04] (184.16s)
and changing Plandex to auto mode. I'll
[03:06] (186.72s)
explain what auto mode is in a moment.
[03:08] (188.64s)
Right now, we're in chat mode. To enable
[03:10] (190.96s)
tell mode, which starts writing code, we use
[03:13] (193.20s)
this command.
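Inside the REPL, the modes are toggled with backslash commands; as I understand the docs, they look like this (check the REPL's help output for the exact names in your version):

  \chat   # brainstorm and plan only, no code changes
  \tell   # switch to tell mode so Plandex starts implementing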
[03:15] (195.44s)
There is also a multi-line input mode, but it's disabled for
[03:17] (197.52s)
now. Plandex has actually provided a
[03:19] (199.60s)
pretty detailed demo. I will demo it
[03:21] (201.68s)
here myself, but first let me show you how
[03:23] (203.36s)
Plandex works through it. This is how Plandex
[03:25] (205.68s)
works. You start in chat mode and
[03:27] (207.76s)
whatever you want to build, you tell it
[03:29] (209.76s)
your ideas and brainstorm with it. Even
[03:31] (211.68s)
if you don't know anything about the
[03:33] (213.12s)
tech stack you're going to use, just
[03:35] (215.12s)
flesh out your ideas. You don't need to
[03:37] (217.20s)
have everything planned from the start.
[03:38] (218.96s)
If you provide a ready-made project,
[03:41] (221.04s)
Plandex can go through the files and
[03:43] (223.12s)
figure out where everything should go
[03:44] (224.96s)
thanks to the way it handles large
[03:46] (226.64s)
context sizes. When you're ready to
[03:48] (228.48s)
code, you can switch to tell mode. It
[03:50] (230.80s)
will automatically ask you and once
[03:52] (232.88s)
enabled, it will start the
[03:54] (234.40s)
implementation. Just like many newer
[03:56] (236.40s)
tools, Plandex breaks down the main task
[03:58] (238.88s)
into smaller steps, each focused on a
[04:01] (241.28s)
single goal, and it works through them
[04:03] (243.28s)
one by one. Another thing is that
[04:05] (245.20s)
whatever changes or files Plandex creates
[04:08] (248.64s)
happen inside a sandbox. After every
[04:11] (251.28s)
tell mode session, you get a prompt
[04:13] (253.60s)
where you can review the changes, apply
[04:15] (255.92s)
them or reject them. Both versions of
[04:18] (258.00s)
the file stay separate until you approve
[04:20] (260.00s)
the changes.
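That review prompt maps onto a few plan commands you can also run yourself; roughly (see plandex help for the exact set in your version):

  $ plandex diff     # review the pending changes in the sandbox
  $ plandex apply    # apply them to your actual project files
  $ plandex reject   # discard them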
[04:22] (262.00s)
This means you can test what you built first. Like Cursor, you
[04:24] (264.08s)
can also run commands and set things up.
[04:25] (265.84s)
Another feature is debugging. If any
[04:27] (267.92s)
commands fail after you accept them, you
[04:30] (270.24s)
can turn on full auto mode, which will
[04:32] (272.32s)
try different fixes by itself. But be
[04:34] (274.40s)
aware, full auto mode will use a lot of
[04:36] (276.64s)
tokens and burn through your OpenRouter
[04:38] (278.88s)
and OpenAI API credits, which can get expensive.
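If you do want it, my understanding is that the autonomy level is configurable from the CLI with something along these lines, but treat the exact command as an assumption and check plandex help first:

  # hypothetical invocation: highest autonomy, heaviest token usage
  $ plandex set-auto full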
[04:40] (280.96s)
I've opened a Swift app I
[04:42] (282.96s)
made using Gemini 2.5 in a previous
[04:45] (285.68s)
video and told Plandex that this is the
[04:48] (288.00s)
project. It read everything, recognized
[04:50] (290.24s)
that it's a macOS menu bar application
[04:52] (292.48s)
built with Swift and SwiftUI, and laid
[04:54] (294.88s)
out the application overview,
[04:56] (296.40s)
architecture, key components, and user
[05:03] (303.88s)
flow. It now understands how the app
[05:06] (306.56s)
works. Let's see if we can make some
[05:08] (308.56s)
changes to the app. Here's the prompt
[05:10] (310.56s)
I'm going with. I'm asking it to improve
[05:12] (312.80s)
the UI for the Swift app, which is
[05:15] (315.12s)
usually hard for AI models. If the
[05:17] (317.44s)
implementation isn't done step by step,
[05:19] (319.60s)
they struggle because they're not deeply
[05:21] (321.52s)
trained on Swift and can't retain
[05:23] (323.44s)
context well for the code. Let's see how
[05:25] (325.60s)
it performs. You can see it reasoned
[05:27] (327.36s)
through the project and is now asking to
[05:29] (329.36s)
switch to tell mode. Let's do that. Now,
[05:31] (331.68s)
it's asking if we want to send another
[05:33] (333.36s)
prompt or begin the implementation.
[05:35] (335.36s)
Let's begin the implementation. You can
[05:37] (337.28s)
see it built the plan and presented me
[05:39] (339.44s)
with this menu. It didn't change the
[05:41] (341.28s)
main files, but created a new script to
[05:43] (343.68s)
automatically build the app. I pressed A
[05:45] (345.84s)
to apply the changes and now it's asking
[05:48] (348.00s)
if I want to execute. I said yes, and
[05:50] (350.40s)
now it's executing. It will build the
[05:52] (352.32s)
app, get the logs, and make changes
[05:54] (354.56s)
based on that. This isn't a big error,
[05:56] (356.48s)
but I want to show you something. If a
[05:58] (358.40s)
command fails, it gives you a menu where
[06:00] (360.40s)
you can choose to debug once or debug in
[06:02] (362.96s)
full auto mode. That's what I'm going to
[06:04] (364.72s)
use right now. You can see that another
[06:06] (366.56s)
plan has been built and now it's
[06:08] (368.32s)
applying the changes. Once all the
[06:10] (370.16s)
changes are done, I'll show you what it
[06:12] (372.08s)
has made. The app was built and I
[06:14] (374.08s)
compiled it using Xcode. This is what it
[06:16] (376.48s)
was able to build. I can now go into
[06:18] (378.40s)
settings, click on the accent color, and
[06:20] (380.72s)
adjust it to any color I want. It's
[06:22] (382.72s)
using the built-in macOS and SwiftUI
[06:24] (384.88s)
components, which is great. You can now
[06:26] (386.88s)
change the accent color to anything like
[06:28] (388.80s)
making it blue. I made the original app
[06:30] (390.80s)
in a previous video using Gemini 2.5,
[06:33] (393.60s)
and that was the only model that could
[06:35] (395.12s)
do it. This is probably why the tool
[06:37] (397.20s)
succeeded: it uses multiple
[06:39] (399.36s)
models from OpenRouter. Everything
[06:41] (401.36s)
works and it looks pretty nice. That's
[06:43] (403.36s)
it for this video. If you liked it,
[06:45] (405.36s)
please consider subscribing if you want
[06:47] (407.20s)
us to keep making these videos. And
[06:49] (409.12s)
since my wallet is running a bit empty,
[06:51] (411.44s)
please consider donating using the link
[06:53] (413.44s)
below. Thanks as always for watching and
[06:55] (415.84s)
I'll see you in the next one.