Learn how to set up the AI development tool Cursor with the DeepSeek V3 and R1 models. Build apps and ...
Video Transcript:
So everyone is talking about DeepSeek. I was in the barber's the other day and it was a topic of conversation; rarely do the very niche things I'm interested in become mainstream media news. So I'm going to do a video on DeepSeek coupled with our favourite coding IDE, Cursor: should you use the two of them together, how do you use the two of them together, and I'm going to cover some of the recent updates to Cursor as well and how you can use them to speed up your AI code generation. All that coming up.

Okay, so first, my quick take on DeepSeek. It's a really great model that originated out of China, and what's amazing is that it's open source. When you're using deepseek.com, the chat version of it, or their API, you are sending your content, your code, your deep secrets all the way over to some kind of processing centre in China — and the same goes for when you're sending your content to OpenAI or any of the other models you might work with. They have various degrees of privacy set up, but you just never know. What's really cool about DeepSeek is that they have a local model, and a range of local variants, that you can install if you've got good enough hardware on your PC or your Mac to actually run the model without connecting to the internet in any way. I did that a couple of days ago and it actually works really well, albeit slow enough on my 3060, but it is really quite compelling. I think what we're going to see is people take the model's open weights, put them onto their own hardware and servers, and offer APIs at a really discounted cost for everyone.

So what does this mean for us building software with AI? It means prices are going to come down significantly. I think there are still challenges that DeepSeek is not addressing in terms of the infrastructure and scale that's needed; a lot of those kinds of problems OpenAI has solved already, along with Anthropic, so there are still some hurdles to get over, but it has definitely shaken up the industry. All I would say is be mindful of how you use the model and what you're using it for, but have some fun with it as well.

Okay, so what I'm going to show you now is how it actually connects to Cursor. First of all, you want to make sure you're running the most recent version, so go up here to Help and then About, and you should see version 0.45.4, or anything above 0.45. If you don't have that, you can hit Ctrl+Shift+P and type in "cursor update"; you'll see it highlighted here as "Cursor: Attempt Update". Run that and it should attempt to update for you. If that doesn't work, you can go to cursor.com
and download the most recent version and see if that works. Now, it does say this is a rolling update, so it might not be available in your area yet, but that's how you go about updating it.

So, a couple of different changes. If we go over to our Cursor settings up here in the top right and scroll down, we can see we now have Project Rules in there — I'll ignore that for a minute and come back to it — but we also have Models. In Models you can enable any model you want, and now you've got DeepSeek R1 and DeepSeek V3. DeepSeek V3 is probably closest to Claude Sonnet, which I would use mostly for software development, and also to GPT-4; it's a mixture-of-experts model. Then you have DeepSeek R1, which is more equivalent to the o1 range: it's got a chain of thought, it'll actually show its reasoning as it thinks through things, and I think it's priced at a higher level.

I'm on the $20 a month plan, which gives you 500 fast premium requests. Basically, a premium request is any call to a model like Claude or GPT-4o, and I think it now also includes the DeepSeek models. It's unclear whether Cursor is going to pass on some savings considering DeepSeek is a lot cheaper, but you do have to take into account that Cursor does a whole lot more than just make an API call on your behalf. It's doing a lot of work in the background to contextualize your codebase, and it's using a variety of different models to achieve different things. If you were to connect Claude or OpenAI directly via API, you'd end up running up a higher bill, because Cursor does a great job of aggregating a lot of the calls and making it more affordable. So I'm still happy to pay $20 a month for the kind of productivity I'm getting.

In order to actually get DeepSeek working, we open the AI pane and, let's say, pick a new composer by clicking the plus button here. Now we have the ability to select the model — DeepSeek V3 or R1 — and then the ability to select normal or agent. Well, actually we don't: the DeepSeek models are not set up to work with the agent yet. So what's the difference between a composer in normal mode and in agent mode? Agent is the mode I use most often, because it essentially does all the work for you: it can run terminal commands, it can create all the files for you, it looks at the linting errors and goes and fixes them. It does a lot more autonomously — it really is an agent, it makes decisions and moves forward — and it's the one I enjoy the most. We're not getting that functionality with DeepSeek yet, and for me that's the reason I won't be using it that much until it's fully supported.

Now, I did work on a project where I used both DeepSeek V3 and DeepSeek R1 to build out a project, and I have to say it was very comparable to using Claude 3.5. I found V3 slightly slower than Claude 3.5, but that could just depend on the service being used — I think they're using Fireworks AI to host the DeepSeek model and have it work with Cursor.

Using R1 is really interesting because you can actually see the chain of thought as it works through things. So let's say I prompt something like: "Let's add Clerk authentication to this project. Don't generate any code, just talk to me about how you would do that." In terms of setup, I'm in normal mode on DeepSeek V3; I want to try out R1 instead, so I switch the model and click submit. What you're going to see here is this "think" section — it's showing the chain of thought as the model reasons through it: okay, the user wants to add authentication, wants to do all this kind of stuff, add the environment variables — and then it gets to the point where it actually makes its suggestions. You can see it's finished thinking, and we've got the recommendations it has made.
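For context, recommendations like that usually boil down to the standard Clerk wiring: keys in your environment and a middleware file. The sketch below is my own minimal illustration, not what the model produced in the video, and it assumes a Next.js App Router project on a recent @clerk/nextjs release (v5+, which exposes clerkMiddleware; older releases used authMiddleware), with NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY and CLERK_SECRET_KEY set in .env.local:

```typescript
// middleware.ts — a minimal sketch only, not the model's output from the video.
// Assumes @clerk/nextjs v5+ and that the Clerk keys are already in .env.local
// as NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY and CLERK_SECRET_KEY.
import { clerkMiddleware } from "@clerk/nextjs/server";

// With no arguments this just wires Clerk's auth context into every matched request;
// route protection and sign-in redirects get layered on top of this.
export default clerkMiddleware();

export const config = {
  // Run on all routes except static files and _next internals, plus API routes
  // (matcher pattern based on Clerk's published examples).
  matcher: ["/((?!.*\\..*|_next).*)", "/", "/(api|trpc)(.*)"],
};
```

On top of that you'd typically wrap the root layout in Clerk's ClerkProvider and add sign-in/sign-up routes, which is broadly the kind of thing the model's recommendations cover.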
Okay, so you might have seen that there's this little icon here. That's because in my Cursor rules I have this little context check, which I use across the various Cursor rules files I'm working with, to make sure the rules are still in context and being applied. When you're working with models and sending lots of content, your context window — how much code it can remember — is only so big, and in longer conversations your rules can often fall out of context. I think Cursor is working on this, but I like to use different little emojis in the different files so I know they're still being taken into account. I've also added little gotchas here, like shadcn: if you're using DeepSeek or Claude, it will use the wrong command to install shadcn components, whereas with GPT-4 it seems to work fine. And I like tidier commit messages, so what it does here is give me a little best practice for my commit messages, and when it finishes its thinking it gives me the git command for whatever I was working on — because one thing that's very important when you're working with AI is that you're constantly committing.

Okay, so let's take a quick pop over to the changelog and see what else is new. The first change is to how Cursor rules are set up — essentially it's setting up a directory structure — so let's pop back into Cursor. In the past, what you did with Cursor rules was use a .cursorrules file: you would add any common rules for that project that you wanted the model to apply, like what stack you're using or any particular gotchas. But now you can go over to your settings again, scroll down, and there are specific project rules, and I can add new rules there. Here's one I've added already: a rule that says use Prisma for all database operations, and here's the actual rule I want applied, because sometimes I find it gets confused between using the Supabase client and the Prisma client, so I just want to make sure we have one source of truth. It automatically creates this folder structure — .cursor/rules — and in there we have this .mdc file, very much like how Obsidian works with your information, and you can actually do some globbing here as well, but I haven't actually tested that out yet.
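To make that concrete, a project rule ends up as a small .mdc file under .cursor/rules. The file below is my illustration rather than the exact one from the video, and the frontmatter fields (description, globs, alwaysApply) are an assumption about the format that's worth checking against the current Cursor docs, but it shows the shape: a short description, optional glob patterns for when the rule should attach, and then the rule text itself.

```
---
description: Use Prisma for all database operations
globs: app/**/*.ts, lib/**/*.ts
alwaysApply: false
---

- All database reads and writes go through the shared Prisma client.
- Do not use the Supabase JS client for database queries; Supabase is only
  the hosting/auth layer in this project.
- Import the client from a single module (e.g. lib/prisma.ts) so there is
  one source of truth.
```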
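And since the whole point of that rule is having one source of truth for database access, the usual way to back it up in a Next.js + Prisma project is a single shared client module. This is a generic sketch — the lib/prisma.ts path is my assumption, not something shown in the video:

```typescript
// lib/prisma.ts — one shared Prisma client for the whole app (hypothetical path).
// Caching the client on globalThis avoids exhausting database connections when
// Next.js hot-reloads modules in development.
import { PrismaClient } from "@prisma/client";

const globalForPrisma = globalThis as unknown as { prisma?: PrismaClient };

export const prisma = globalForPrisma.prisma ?? new PrismaClient();

if (process.env.NODE_ENV !== "production") {
  globalForPrisma.prisma = prisma;
}
```

Every query then imports prisma from this one module, which is exactly the convention the rule is meant to stop the model from drifting away from.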
So that's a departure from how Cursor rules normally worked. I imagine this is useful because a single .cursorrules file probably contains various different rules I want to apply, and maybe I want to componentize them and move them across projects, or have them reference certain files when they're run and bring in different context at different times. You know, before I've talked about using a design mode — when I work in design mode I want to scaffold or put the skeleton of the app together — and in this case maybe I could just move that design mode into a rule and add the context of that rule only when I want it to work in that way. So that's, at a high level, the new Cursor rules.

What else have we got? Summarizing previous composers. It's important, when you're using your composer to build a feature, to refresh it or create a new composer rather than having one long conversation, because of course context can fall out and you can run into problems there. What you didn't have before was the ability to have the new composer understand what happened previously. What happens now is it summarizes the previous composer windows using a model and adds that to the context when you create the new composer, so it brings forward that knowledge. Everything is about managing your context and your memory.

Okay, so the agent's recent changes: the agent can use a tool to see your recent changes, and it also sees changes made between user messages. I'm not 100% sure on this one, but I'm assuming that as the agent runs through its commands and its work, you might be making independent changes in the background, and it's now able to pick up on that. What I'd love for it to be able to do — not unlike how Cline works — is actually take a snapshot of what's been produced on your localhost in the browser, see any console errors, and work to fix those. That's the feature I'd love to see, and it's what I like better in Cline.

Better codebase understanding: they've got a new model for codebase understanding. Within VS Code — within Cursor — there's a way of indexing and understanding your codebase, and this is really important for how the model works, what context it sends, and how that context is managed. So I assume they've made some improvements there. I didn't notice a huge difference in the projects I've been working on, but it's good to see.

The Fusion model: they've improved tab completion. To be honest, I'm not using tab a whole lot — I actually don't write much code. I'm using the model to generate it, checking it, and asking questions about it, but I write very little code myself outside of pasting in a couple of API keys and things like that, so I can't really speak to whether there's a big difference.

And then there's an optional long context. If you go over to Cursor again and click into — I think it's Features — and scroll down, we've now got the ability to use a larger context window. It can use larger context windows, but of course it's going to use more of your credits in doing so.

So my quick summary on DeepSeek and Cursor: definitely go and enable it and have a play. It's interesting to use the R1 model to see how it thinks and how it works. For me, I'm going to stick with Claude 3.5 for now; I find it's a little bit faster, and the fact that it's integrated with the agent is still a game-changer for me. I don't like using normal mode in composer — I've really gotten used to using agent — so I'll be sticking with Claude 3.5.