In today’s fast-paced tech landscape, beginner coders often find themselves overwhelmed by the vast amount of information available online. They struggle to build a structured learning path, have difficulty understanding complex concepts, and lack immediate, context-sensitive assistance while learning to code. Without personalized mentorship, this often leads to frustration and slow progress.
CodeMentor is designed to bridge these gaps by offering an AI-powered, interactive learning platform that personalizes the learning experience for each user. With features tailored to beginners, it provides a structured roadmap, instant content generation, real-time code feedback, and mentorship—making learning both easier and more engaging.
Here are the features of the project:
Feature Description: Live Coding Feedback in CodeMentor offers real-time, interactive guidance as users write code. The system analyzes the code continuously, providing instant feedback on errors and inefficiencies. When issues are detected, they are highlighted with clear, concise explanations. Additionally, AI-guided hints help users solve coding problems step by step without revealing the full solution. This feature accelerates learning by catching mistakes quickly while building problem-solving skills.
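The continuous analysis loop can be approximated with a static check. Below is a minimal sketch, not CodeMentor's actual analyzer: it only catches syntax errors via Python's built-in `compile()`, whereas the real system also flags inefficiencies and offers AI-guided hints.

```python
def quick_feedback(source: str) -> list[str]:
    """Return human-readable issues found in the submitted code.

    A stand-in for the platform's analyzer: it only detects syntax
    errors; an LLM-backed checker would go further.
    """
    issues = []
    try:
        compile(source, "<editor>", "exec")
    except SyntaxError as exc:
        issues.append(f"Line {exc.lineno}: {exc.msg}")
    return issues

# A missing colon is caught immediately, with the offending line number.
print(quick_feedback("for i in range(3)\n    print(i)"))
```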
Demo video: 2025-01-25.20-50-30.mp4
Feature Description: Quiz Generation and Analysis is an integral feature of CodeMentor that enhances the learning experience by creating quizzes based on the user's content. The system generates tailored quizzes that evaluate the user's understanding, providing detailed analysis and feedback on their performance. Additionally, users can retake quizzes focused on previously incorrect answers, allowing them to reinforce their knowledge and address weak areas effectively. This adaptive approach ensures continuous learning and improvement.
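The "retake only what you got wrong" behavior can be sketched in a few lines. The `QuizItem` shape and grading-by-exact-match are illustrative assumptions, not CodeMentor's actual data model:

```python
from dataclasses import dataclass

@dataclass
class QuizItem:
    question: str
    correct_answer: str

def retake_pool(items: list[QuizItem], given_answers: list[str]) -> list[QuizItem]:
    """Keep only the items the user answered incorrectly,
    so a retake quiz can target weak areas."""
    return [item for item, given in zip(items, given_answers)
            if given != item.correct_answer]

quiz = [QuizItem("What keyword defines a function in Python?", "def"),
        QuizItem("Which loop repeats while a condition holds?", "while")]
wrong_only = retake_pool(quiz, ["def", "for"])  # second answer is wrong
```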
Demo video: 2025-01-25.20-39-38.mp4
Feature Description: Adaptive Mentoring with Logical Reasoning is a key feature of CodeMentor. The AI tailors the learning journey based on the user's performance, employing logical reasoning to determine the next steps. As users progress, the system adapts the mentoring content to their needs, ensuring they build on their strengths and address any weaknesses. This dynamic approach provides a personalized and continuous learning experience, guiding users through their educational journey with precision and insight.
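The "determine the next step from performance" logic might look like the following sketch. The score thresholds and action strings are illustrative placeholders, not CodeMentor's actual policy:

```python
def next_step(topic: str, score: float) -> str:
    """Decide the next mentoring action from a quiz score in [0, 1].

    Illustrative thresholds: strong scores advance, middling scores
    get extra practice, weak scores trigger a topic review.
    """
    if score >= 0.8:
        return f"advance past {topic}"
    if score >= 0.5:
        return f"practice more problems on {topic}"
    return f"review the theory for {topic}"
```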
Demo video: 2025-01-25.20-34-59.mp4
Feature Description: User Preference-Based Problem Generation is a versatile feature of CodeMentor, designed to cater to individual learning preferences. The platform allows users to generate problems based on their specific interests and needs. Moreover, each problem's difficulty can be adjusted—leveled up or down—based on the user's preference, ensuring that learners are consistently challenged at the right level. This adaptive feature empowers users to customize their learning experience and progress at their own pace, enhancing their overall mastery of coding concepts.
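Leveling a problem up or down reduces to stepping along a difficulty ladder with clamping at both ends. The level names below are placeholders, not CodeMentor's actual scale:

```python
LEVELS = ["beginner", "easy", "medium", "hard", "expert"]

def adjust_difficulty(current: str, direction: int) -> str:
    """Move one step up (+1) or down (-1) the difficulty ladder,
    clamped so the user can't fall off either end."""
    i = LEVELS.index(current) + direction
    return LEVELS[max(0, min(i, len(LEVELS) - 1))]
```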
Feature Description: The AI Voice Mentor is an interactive voice assistant that answers questions about coding challenges. Users can ask for guidance and receive verbal explanations and support, enhancing their understanding and learning experience. It offers real-time, spoken feedback and hints, making it a valuable tool for hands-free assistance while coding.
Feature Description: Users can create personalized roadmaps for learning a specific programming language or concept, such as "I want to learn C++." The roadmap consists of a list of topics (e.g., syntax, loops, functions) tailored to the user's learning journey.
Stack Usage:
- Frontend (Next.js): Users input their desired learning path, and the frontend sends a request to the backend to generate the roadmap.
- Backend (FastAPI): FastAPI handles the incoming request, triggering the Roadmap Agent (powered by OpenAI Swarm) to generate a topic list for the selected language.
- Database (PostgreSQL): The generated roadmap (topics) is stored in PostgreSQL under the user's profile for easy retrieval.
- Prisma: Prisma ORM interacts with PostgreSQL to efficiently manage and query the user’s roadmap data.
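The flow above can be sketched end to end in plain Python. The canned topic list stands in for the Roadmap Agent's LLM call, and the in-memory dictionary stands in for the Prisma/PostgreSQL layer; both are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Roadmap:
    user_id: str
    language: str
    topics: list[str] = field(default_factory=list)

def generate_roadmap(user_id: str, language: str) -> Roadmap:
    """Stand-in for the Roadmap Agent: a real implementation would
    ask the LLM for a topic list; here we return a canned one."""
    canned = {"C++": ["syntax", "loops", "functions", "classes"]}
    return Roadmap(user_id, language, canned.get(language, []))

# Stand-in for the Prisma/PostgreSQL layer: an in-memory store keyed by user.
store: dict[str, Roadmap] = {}
roadmap = generate_roadmap("user-42", "C++")
store[roadmap.user_id] = roadmap
```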
Feature Description: Once the user selects a specific topic (e.g., Python loops), the platform generates detailed content, including theory, syntax, and example code.
Stack Usage:
- Frontend (Next.js): The user selects a topic, and the frontend requests content generation via the backend. The request includes the topic for which content (theory, syntax, example) needs to be created.
- Backend (FastAPI): FastAPI routes the request to the respective agents:
  - Theory Agent: generates detailed theory about the topic.
  - Syntax Agent: provides the syntax of the concept.
  - Example Code Agent: generates relevant code examples.
- Database (PostgreSQL): Each piece of content (theory, syntax, and example) is saved in the database, associating it with the user’s profile and selected topic.
- Prisma: Prisma is used to manage and query content stored in the database, ensuring smooth fetching and updating of content.
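The fan-out to the three agents can be sketched as below. Each agent function is a placeholder returning canned text where the real system would make an LLM call; the bundled dictionary mirrors what gets stored per topic:

```python
def theory_agent(topic: str) -> str:
    # Placeholder for an LLM call that writes the theory section.
    return f"Theory notes for {topic}."

def syntax_agent(topic: str) -> str:
    # Placeholder for an LLM call that shows the syntax.
    return f"Syntax reference for {topic}."

def example_agent(topic: str) -> str:
    # Placeholder for an LLM call that writes example code.
    return f"# example code for {topic}"

def generate_content(topic: str) -> dict[str, str]:
    """Fan the request out to the three agents, as the FastAPI
    route does, and bundle the results for storage."""
    return {"theory": theory_agent(topic),
            "syntax": syntax_agent(topic),
            "example": example_agent(topic)}
```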
Feature Description: The platform provides a code editor (Monaco Editor) where users can write code. They can select a portion of the code and ask specific questions. The system will process the query and provide feedback, helping them learn.
Stack Usage:
- Frontend (Next.js + Monaco Editor): Monaco Editor is embedded in the frontend, enabling users to write and edit code. The user selects a portion of the code and submits a question.
- Backend (FastAPI): FastAPI handles the query by sending it to the Mentorship Agent from OpenAI Swarm, which processes the selected code and generates an answer.
- Database (PostgreSQL): The Q&A interaction (question and response) is saved in PostgreSQL, allowing the user to refer back to it later.
- Prisma: Prisma is used to save the user’s code snippets and interactions into the database for future retrieval.
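The persisted Q&A interaction can be modeled as a small record. The `Interaction` shape and the in-memory list are illustrative stand-ins for what Prisma saves to PostgreSQL:

```python
from dataclasses import dataclass

@dataclass
class Interaction:
    user_id: str
    code_snippet: str
    question: str
    answer: str

def record_interaction(history: list[Interaction], user_id: str,
                       snippet: str, question: str, answer: str) -> list[Interaction]:
    """Append a Q&A exchange so the user can revisit it later,
    mirroring what the database layer persists."""
    history.append(Interaction(user_id, snippet, question, answer))
    return history

history: list[Interaction] = []
record_interaction(history, "user-42", "total += n",
                   "Why does this fail?", "total must be initialized first.")
```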
Feature Description: Users can ask questions about specific parts of their code. The system identifies the selected code, processes the question, and provides relevant guidance, explanations, or suggestions.
Stack Usage:
- Frontend (Next.js + Monaco Editor): The Monaco Editor allows users to highlight a piece of code and ask a question about it. The request is sent to the backend.
- Backend (FastAPI): FastAPI handles the request and passes it to the Mentorship Agent for analysis. The agent responds with feedback or clarification on the selected code.
- Database (PostgreSQL): User-generated questions and the feedback received are stored in the database for reference and can be revisited by the user later.
- Prisma: Prisma is used for efficient database operations, ensuring that questions, responses, and code snippets are stored and retrieved smoothly.
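Before the Mentorship Agent can analyze a highlighted span, the backend has to extract it from the full editor contents. A sketch, assuming Monaco's 1-based line/column selection convention with an exclusive end column:

```python
def extract_selection(source: str, start_line: int, start_col: int,
                      end_line: int, end_col: int) -> str:
    """Pull the highlighted span out of the full editor contents.

    Assumes Monaco-style 1-based lines/columns with an exclusive
    end column; only this span is sent on for analysis.
    """
    lines = source.splitlines()
    if start_line == end_line:
        return lines[start_line - 1][start_col - 1:end_col - 1]
    parts = [lines[start_line - 1][start_col - 1:]]   # tail of first line
    parts += lines[start_line:end_line - 1]           # whole middle lines
    parts.append(lines[end_line - 1][:end_col - 1])   # head of last line
    return "\n".join(parts)
```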
Feature Description: Users can create accounts, log in, and store their learning progress, including roadmaps, content, and interactions. Each user has a personalized profile to track their learning journey.
Stack Usage:
- Frontend (Next.js + Clerk): Clerk is used for handling user authentication. It provides a seamless login, registration, and session management experience. The frontend integrates Clerk's API to authenticate users and manage their sessions.
- Backend (FastAPI): FastAPI processes authentication tokens provided by Clerk for secure access to user data.
- Database (PostgreSQL): User profiles, credentials, and learning data (roadmaps, content, Q&A) are stored securely in PostgreSQL.
- Prisma: Prisma ORM is used to interact with PostgreSQL, ensuring efficient data retrieval and updates related to user profiles and learning progress.
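For illustration only, here is how a backend can read the claims inside a Clerk-issued JWT. This sketch does NOT verify the signature; a real FastAPI backend must verify the token against Clerk's published public keys (JWKS) before trusting any claim:

```python
import base64
import json

def decode_token_payload(token: str) -> dict:
    """Decode a JWT's payload segment (illustration only).

    No signature verification is performed here; production code
    must validate the token against Clerk's JWKS first.
    """
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))
```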
Demo video: CodeMentor.1.mp4
Technical Documentation.
Demonstration Video.
Before you start, ensure that you have the following installed:

- Docker
- Docker Compose
Clone the repository:

```bash
git clone https://github.com/SuMayaBee/CodeMentor.git
cd CodeMentor
```
Create a `.env` file in the project root directory (where `docker-compose.yaml` is located) and add your OpenAI API key:

```bash
OPENAI_API_KEY=your_openai_api_key_here
```
Build the Docker images:

```bash
docker compose build
```
Start the services:

```bash
docker compose up
```
The frontend and backend will now be running. You can access them at the URLs specified in the `docker-compose.yaml` file (commonly `http://localhost:3000` for the frontend and `http://localhost:8000` for the backend).
- Ensure that your `.env` file is correctly placed and contains a valid `OPENAI_API_KEY`.
- Check that Docker and Docker Compose are correctly installed and running on your machine.
- If you encounter issues, try stopping the containers with `docker compose down` and restarting them with `docker compose up`.
We welcome contributions and feedback! Feel free to open an issue or submit a pull request to improve the project.