Bolt.New + Ollama: AI Coding Agent BEATS v0, Cursor, Bolt.New, & Cline! - 100% Local + FREE!
Introduction
Recently, there's been excitement surrounding Bolt.New, a revolutionary tool that enables users to create full-stack applications without writing any code. With just a prompt describing your requirements, you can easily run, edit, and deploy full-stack applications using the large language model (LLM) of your choice. Our tests have shown the best results when using Claude 3.5 Sonnet, which is noted for its ability to generate modern, fully functional applications.
In previous videos, we explored an open-source fork of Bolt.New and demonstrated how it works with different AI code assistants. I highly encourage you to check those out (links can be found in the description below). Because of your continued requests for Ollama integration, today's showcase details how to connect the open-source Bolt.New fork with Ollama.
For those unfamiliar, Ollama lets you run open-source models locally, which helps avoid the costs associated with token usage while giving you access to a range of models based on personal preference, from Llama 3.2 Vision to DeepSeek Coder. While Claude 3.5 Sonnet is still an excellent choice, these open-source models can offer comparable quality without the expenses typical of cloud-based options.
Installation Prerequisites
Before diving in, make sure to meet the following prerequisites:
- Ollama installed on your operating system.
- Git installed to clone the repository.
- Node.js installed.
- Docker installed (optional, as you can run Bolt.New locally without Docker).
All necessary links can be found in the video description.
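You can quickly verify each prerequisite from the terminal (version numbers will vary by machine):
# Each command should print a version string if the tool is installed
ollama --version
git --version
node --version
docker --version   # optional; only needed if you choose the Docker route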
Installation Steps
Clone the Bolt.New Repository:
- Visit the Bolt.New fork's GitHub repository and copy its URL.
- Open your command prompt (or terminal) and execute the command:
git clone [URL]
- Navigate into the repository’s directory with:
cd bolt.new-any-llm
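For example, assuming the fork is hosted at coleam00/bolt.new-any-llm on GitHub (substitute the URL you actually copied):
# Clone the fork and move into its directory
git clone https://github.com/coleam00/bolt.new-any-llm.git
cd bolt.new-any-llm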
Install Your Preferred Model:
- Visit Ollama's website and select a model you prefer. In this example, we will install Qwen 2.5 Coder. Copy the relevant run command from Ollama:
ollama run [model]
- Run this command in the terminal to install the model, which may take several minutes.
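As a concrete sketch, assuming the 7-billion-parameter build is tagged qwen2.5-coder:7b in Ollama's library (check the model page for the exact tag):
# Downloads the model on first run, then opens an interactive session
ollama run qwen2.5-coder:7b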
Setting up the Project
Open the Project in an IDE:
- Launch your IDE (e.g., VS Code) and open the cloned Bolt.New folder.
- Create a new folder named model_file.
Create the Model List:
- Right-click the model_file folder and create a new file named model_list.
- Input the following lines into the file:
FROM [model]
PARAMETER num_ctx 32768
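As a concrete sketch, assuming the qwen2.5-coder:7b tag from the earlier step (FROM and PARAMETER num_ctx are standard Ollama Modelfile directives):
# Extend the base model with a 32k-token context window
FROM qwen2.5-coder:7b
PARAMETER num_ctx 32768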
Initialize Ollama:
- In the terminal, initialize Ollama by running:
ollama create -f [model_file_path] [new_model_name]
- List installed models with:
ollama list
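Putting the earlier example together (the new model name qwen2.5-coder-32k is arbitrary; choose any name you like):
# Build the extended model from the file, then confirm it is registered
ollama create -f model_file/model_list qwen2.5-coder-32k
ollama list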
Running Bolt.New
You can run Bolt.New either with Docker or without it. To run it without Docker, make sure the pnpm package manager is available, then install the project's dependencies:
pnpm install
After installation, start the development server with:
pnpm run dev
This starts Bolt.New with the Qwen 2.5 Coder model on your local machine.
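If the Ollama provider does not appear, point Bolt.New at your local Ollama server. A minimal sketch, assuming the fork reads an OLLAMA_API_BASE_URL variable from a .env.local file in the project root (check the repository's .env.example for the exact name):
# .env.local (Ollama listens on port 11434 by default)
OLLAMA_API_BASE_URL=http://localhost:11434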
Showcasing Capabilities
Access Bolt.New through the localhost link, and you will see the available model providers. Once you select Ollama, you can prompt it to perform various tasks, like creating a simple web page.
The model generates the necessary files and folders, and you can preview your project live. Though a 7-billion-parameter model may yield more basic outputs than Claude 3.5 Sonnet, it is still capable of generating useful code snippets.
For example, when asked to generate a snake game, the model proficiently created a fully functional game that tracks scores and is visually impressive.
In another instance, using the Claude 3.5 Sonnet model to create a full-stack application resulted in a functional financial dashboard, complete with features for managing budgets and transactions.
Conclusion
In summary, integrating Bolt.New with Ollama is a straightforward process, enabling users to save on token expenses while still unlocking robust AI capabilities for local development. I encourage you to try this out, as it opens doors to various coding efficiencies and creative possibilities.
Stay tuned for more videos on this topic and feel free to check the description for helpful links. Don't forget to follow me on Patreon for access to free AI tools, and subscribe to stay updated on the latest in AI technology!
Keywords
- Bolt.New
- Ollama
- AI Coding Agent
- Full Stack Applications
- Claude 3.5 Sonnet
- Open-Source Models
- Local Development
FAQ
1. What is Bolt.New?
Bolt.New is a tool that allows users to create full-stack applications without writing any code by simply prompting their requirements.
2. What is Ollama?
Ollama is a platform that enables users to run open-source AI models locally, avoiding costs associated with token usage.
3. Why should I choose open-source models over cloud-based models?
Open-source models can provide similar performance without the costs of token usage that are typical with cloud-based models.
4. What are the prerequisites for installing Bolt.New with Ollama?
You need to install Ollama, Git, Node.js, and Docker (optional) on your operating system.
5. Can I run Bolt.New without Docker?
Yes, you can run Bolt.New without Docker as long as you have the necessary packages installed, such as pnpm.