[00:00] (0.16s)
The most important thing about LLMs is
[00:02] (2.16s)
memory and context. Think about it.
[00:04] (4.08s)
We're living in a multiplatform world
[00:06] (6.00s)
where we constantly jump between
[00:07] (7.76s)
different AI tools. You have multiple
[00:09] (9.68s)
tabs open in Cursor. You're
[00:11] (11.36s)
brainstorming in Claude or working on
[00:13] (13.28s)
another project in Windsurf. But here's
[00:15] (15.20s)
the problem. Your memory doesn't persist
[00:17] (17.12s)
across these platforms. Now, what if
[00:19] (19.20s)
there was a small memory box, a
[00:21] (21.04s)
universal container where you could
[00:22] (22.80s)
store all your context, preferences, and
[00:25] (25.28s)
project details and carry it with you
[00:27] (27.44s)
across every single AI agent and LLM you
[00:30] (30.56s)
use. This wasn't really possible before,
[00:32] (32.96s)
but ever since MCPs came around, it's
[00:35] (35.36s)
become incredibly easy to implement. And
[00:37] (37.36s)
if you're wondering whether it's hard to
[00:39] (39.04s)
set up, no, it's actually quite simple.
[00:41] (41.36s)
So, let's get into the video and I'll
[00:43] (43.04s)
show you the universal memory I'm
[00:44] (44.80s)
talking about. So, the tool I'm about to
[00:47] (47.20s)
show you is actually called Supermemory
[00:49] (49.20s)
MCP, and it's built on top of the
[00:51] (51.36s)
Supermemory API. I'll explain how it works in
[00:53] (53.68s)
a bit, but essentially it gives you your
[00:55] (55.60s)
own personal and universal memory MCP.
[00:58] (58.08s)
And the problem they're solving is
[00:59] (59.60s)
actually pretty interesting. They point
[01:01] (61.36s)
out that everyone is building their own
[01:03] (63.36s)
memory layer. So, as I already
[01:05] (65.12s)
mentioned, why not just carry it around
[01:06] (66.96s)
with us? They basically give us this
[01:08] (68.96s)
MCP. You just install it for whichever
[01:11] (71.36s)
client you're using. You can see all the
[01:13] (73.36s)
MCP clients listed here, and then go
[01:15] (75.60s)
ahead and collect memories from those
[01:17] (77.36s)
clients. It stores them seamlessly. And
[01:19] (79.52s)
if a memory exists in one client, it
[01:21] (81.76s)
will automatically be available in the
[01:23] (83.52s)
others as well. This MCP is essentially
[01:26] (86.32s)
giving you a memory box where you can
[01:28] (88.32s)
store all your memories, whether they're
[01:30] (90.40s)
related to a specific project you're
[01:32] (92.40s)
working on or just general preferences
[01:34] (94.40s)
you want to keep. You can store any type
[01:36] (96.32s)
of memory and use it wherever you want.
[01:38] (98.56s)
Now, you might be thinking, how would
[01:40] (100.16s)
you actually use this in your workflow?
[01:42] (102.08s)
Well, there are two main ways. First, if
[01:44] (104.32s)
you're working in any IDE and want to
[01:46] (106.40s)
discuss something about your project
[01:48] (108.00s)
with the Claude desktop app, you can have
[01:50] (110.24s)
persistent memory there, so your context
[01:52] (112.56s)
stays preserved. Or for example, if you
[01:54] (114.80s)
have multiple windows of your AI IDE
[01:57] (117.52s)
open and want the context to persist
[01:59] (119.60s)
across them, you can use this tool as
[02:01] (121.76s)
well. Let me show you an example. I'm in
[02:04] (124.00s)
Cursor right now, and I just told it that
[02:05] (125.92s)
I'd like to build a simple water tracker
[02:08] (128.08s)
app for plants. The first thing I
[02:09] (129.92s)
mentioned was that the design of the app
[02:11] (131.84s)
should be a 2D pixelated style. Then I
[02:14] (134.24s)
gave it a few other specifications like
[02:16] (136.40s)
using Next.js and asked it to search for
[02:18] (138.88s)
React libraries that could help create
[02:21] (141.04s)
pixelated components. I also gave it a
[02:23] (143.44s)
couple of additional requirements. So it
[02:25] (145.44s)
went ahead and did its thing, found some
[02:27] (147.28s)
libraries and gave me a response. I told
[02:29] (149.44s)
it which specific library I wanted to
[02:31] (151.52s)
use for generating the pixelated plant
[02:33] (153.68s)
images, and Cursor built a small
[02:35] (155.92s)
app for me. There were a few errors
[02:37] (157.68s)
along the way, but I just copy pasted
[02:39] (159.76s)
them into the chat and Cursor fixed them
[02:41] (161.76s)
right away. This is the water tracker
[02:43] (163.68s)
app it created. Right now, you can see
[02:45] (165.84s)
I've clicked on all of the plants. So,
[02:47] (167.76s)
it says they were last watered 0 days
[02:49] (169.92s)
ago. They have 100% health and they've
[02:52] (172.40s)
all been watered. This is the pixelated
[02:54] (174.32s)
design it came up with. It's a cute
[02:56] (176.08s)
little interface, but for now, it's just
[02:58] (178.08s)
a basic MVP and still missing a lot of
[03:00] (180.48s)
features. So, here's what I did next. I
[03:02] (182.40s)
asked Cursor to generate a description
[03:04] (184.32s)
of the initial prompt I had given, the
[03:06] (186.16s)
tools used to build the app, and the
[03:08] (188.00s)
structure and features of the MVP. Then
[03:10] (190.08s)
I told it to add all of that to memory.
[03:12] (192.00s)
It went ahead and created a project
[03:13] (193.68s)
summary of everything built so far and
[03:16] (196.00s)
then called the MCP tool. If you scroll
[03:18] (198.32s)
down, you'll see the second memory that
[03:20] (200.40s)
was added. It details the development of
[03:22] (202.48s)
the pixelated plant water tracker, what
[03:24] (204.64s)
the MVP included and the technologies
[03:26] (206.96s)
used. So whenever I want to revisit or
[03:29] (209.12s)
expand the app, all the context is
[03:31] (211.04s)
already stored. For example, I went into
[03:33] (213.20s)
Claude and said, "I've built the plant
[03:35] (215.04s)
water tracker. Pull the context from the
[03:37] (217.12s)
MCP because I want to add new features."
[03:39] (219.44s)
Claude searched Supermemory, retrieved
[03:41] (221.60s)
the full project details, and gave me a
[03:43] (223.76s)
list of potential new features. Then I
[03:45] (225.84s)
told it the specific ones I wanted.
[03:47] (227.68s)
Plant management, a separate page for
[03:49] (229.60s)
the plant library, and some gamification
[03:51] (231.84s)
features. Claude then organized those
[03:53] (233.92s)
into a categorized list of features we
[03:56] (236.16s)
could add. But I said the list was too
[03:57] (237.92s)
extensive and asked for just the
[03:59] (239.84s)
essential features to expand the app
[04:01] (241.84s)
step by step. So it created a more
[04:04] (244.00s)
focused feature list. I then told it to
[04:06] (246.24s)
add the refined list to memory. And once
[04:08] (248.48s)
again it stored it in Supermemory. Now
[04:11] (251.12s)
if you check the first memory, it shows
[04:13] (253.12s)
a document outlining the planned
[04:15] (255.04s)
features for the pixelated water tracker
[04:17] (257.28s)
app categorized into three main areas
[04:19] (259.68s)
with several features under each. And
[04:21] (261.76s)
finally, back in Cursor, I told it to
[04:24] (264.00s)
search memory again to implement some of
[04:25] (265.92s)
the new features already logged. As
[04:27] (267.92s)
expected, it called the MCP tool,
[04:30] (270.08s)
checked the memory directory, and
[04:31] (271.76s)
started planning how to implement those
[04:33] (273.60s)
features. Oh, and if you're enjoying the
[04:35] (275.84s)
content we're making, I'd really
[04:37] (277.36s)
appreciate it if you hit that subscribe
[04:39] (279.20s)
button. We're also starting to test out
[04:41] (281.20s)
channel memberships to help support what
[04:43] (283.12s)
we're building here. Right now, we've
[04:44] (284.88s)
only launched the first tier, and it
[04:46] (286.64s)
gives you priority replies to your
[04:48] (288.48s)
comments. So, if you ever have a
[04:50] (290.00s)
question or want feedback, you'll get
[04:51] (291.68s)
bumped to the front of the line. If you
[04:53] (293.76s)
think that telling Cursor or any other
[04:55] (295.68s)
IDE to add your project description into
[04:58] (298.24s)
memory again and again is going to get
[05:00] (300.24s)
redundant, there's a fix for that, too.
[05:02] (302.16s)
Just head into your settings. This might
[05:04] (304.16s)
vary depending on the IDE, but go ahead
[05:06] (306.40s)
and add project specific rules. As you
[05:08] (308.80s)
can see here, I've added a rule. And if
[05:10] (310.72s)
I open it, you'll see that I've
[05:12] (312.16s)
instructed it. If any changes are made
[05:14] (314.08s)
to the app, even structural changes,
[05:16] (316.24s)
dependency updates, or tech stack edits,
[05:18] (318.64s)
it should automatically upload and
[05:20] (320.40s)
update those changes in memory. You can
[05:22] (322.72s)
also set the rule type to always, so
[05:25] (325.04s)
it's automatically attached to every
[05:26] (326.88s)
chat within the agent. That way, anytime
[05:29] (329.28s)
changes are made to your app, they're
[05:31] (331.04s)
uploaded to memory without you having to
[05:33] (333.04s)
do it manually each time. Now, if you're
[05:35] (335.52s)
wondering how this actually works,
[05:37] (337.20s)
basically the MCP has two tools that you
[05:39] (339.60s)
can see: the add to Supermemory tool
[05:41] (341.68s)
and the search Supermemory tool. The
[05:43] (343.76s)
add to Supermemory tool simply takes
[05:46] (346.00s)
whatever text you give it and adds it to
[05:48] (348.16s)
memory exactly as it is. Then the search
[05:50] (350.72s)
Supermemory tool takes a keyword based
[05:53] (353.04s)
on what you're looking for, queries your
[05:55] (355.04s)
entire memory and returns any memories
[05:57] (357.36s)
that are related to that keyword. So
[05:59] (359.28s)
that's how the whole MCP server works.
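The behavior of those two tools can be sketched in a few lines of Python. This is a simplified stand-in, not the real server: the actual Supermemory backend persists memories remotely and does smarter retrieval, while here memory is just an in-process list and search is plain case-insensitive substring matching.

```python
# Simplified sketch of the two MCP tools' behavior.
# Assumption: the real Supermemory backend stores memories remotely and
# searches semantically; this stand-in uses a list and substring matching.

memories: list[str] = []

def add_to_supermemory(text: str) -> str:
    """Store the text verbatim, exactly as the client sent it."""
    memories.append(text)
    return f"Stored memory #{len(memories)}"

def search_supermemory(keyword: str) -> list[str]:
    """Return every stored memory that mentions the keyword."""
    return [m for m in memories if keyword.lower() in m.lower()]

# One client stores project context...
add_to_supermemory("Plant water tracker MVP: Next.js, 2D pixelated UI")
# ...and any other client can pull it back later by keyword.
print(search_supermemory("pixelated"))
```

The point of the sketch is the shape of the contract: one tool writes raw text, the other takes a keyword and returns matching memories, and everything else (persistence, relevance ranking) lives behind the API.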
[06:01] (361.52s)
On the back end, it uses the
[06:03] (363.52s)
Supermemory API, which is a memory layer for
[06:05] (365.92s)
LLMs that you can actually use in your
[06:07] (367.92s)
own code. The creator built the MCP
[06:10] (370.16s)
using this API and it handles both
[06:12] (372.40s)
storage and querying of the memory base
[06:14] (374.56s)
through it. It's a solid memory layer,
[06:16] (376.56s)
fast, efficient, and well-built. So, the
[06:18] (378.64s)
back end it's running on is quite strong
[06:20] (380.56s)
as well. So, at this point, you might be
[06:22] (382.96s)
wondering, how do you actually install
[06:24] (384.72s)
the tool? On the tool's website, you'll
[06:26] (386.80s)
see a unique MCP URL. This URL is
[06:30] (390.00s)
specific to you and is what's used to
[06:31] (391.92s)
store your memories. If you scroll down,
[06:34] (394.16s)
you'll find installation commands for
[06:36] (396.00s)
different clients like Claude, Cursor,
[06:38] (398.08s)
and other generic AI applications that
[06:40] (400.32s)
support MCPs. I've already installed it
[06:42] (402.72s)
for Claude and Cursor, and the tools are
[06:44] (404.64s)
available and working. Now, let's say I
[06:46] (406.72s)
want to install it for Windsurf as well.
[06:48] (408.88s)
All you need to do is copy the command,
[06:51] (411.20s)
then open your terminal. In the
[06:52] (412.72s)
terminal, just paste the entire command.
[06:55] (415.04s)
Now, for some reason, there's an issue
[06:56] (416.72s)
with the default installation command.
[06:58] (418.72s)
If I run it as is, I get an error and it
[07:01] (421.12s)
doesn't install. So, here's the fix.
[07:02] (422.96s)
Just paste the command again, remove the
[07:04] (424.96s)
extra 'i', and then run it. Now it works.
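For reference, what an installer like this typically ends up writing is an entry in the client's MCP config file. The sketch below is an assumption about the general shape, not the exact output: the file name, the wrapper command, and the URL (shown as a placeholder, substitute your own unique MCP URL from the site) all vary by client.

```json
{
  "mcpServers": {
    "supermemory": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://<your-unique-mcp-url>"]
    }
  }
}
```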
[07:07] (427.28s)
It'll prompt you for a confirmation.
[07:09] (429.12s)
Just go ahead and accept it. Once that's
[07:11] (431.12s)
done, you'll get a confirmation that the
[07:12] (432.96s)
MCP server has been successfully
[07:14] (434.96s)
installed in Windsurf. If you refresh
[07:16] (436.96s)
Windsurf, you'll see that the MCP server
[07:19] (439.12s)
has been added. That brings us to the
[07:21] (441.12s)
end of this video. If you'd like to
[07:22] (442.72s)
support the channel and help us keep
[07:24] (444.40s)
making tutorials like this, you can do
[07:26] (446.48s)
so by using the super thanks button
[07:28] (448.32s)
below. As always, thank you for watching
[07:30] (450.80s)
and I'll see you in the next one.