How I Built an AI App to Review YC Applications in a Day

Judy@webdecoded
6 min read · Aug 30, 2024

--

YCombinator is running its first-ever Fall batch this year. Per their website, they introduced the Fall batch due to high demand, and with the deadline on 8/27 a lot of people were applying and looking for quick feedback on their applications. As someone who has been following YC's content and the journeys of its startups, I knew where to look for useful tips, do's and don'ts, and the things worth highlighting when applying to the accelerator.

After collecting links to its public data (YouTube and the ycombinator website), I created a new project on CustomGPT and added the sources I would later use to build my custom AI agent. Since the data is far too large for the context window of most AI models, passing it all in a single prompt would not have worked. Another way to give the AI access to the data would have been to scrape it, store it in a vector database, and then, when running a query such as "can you review this section about equity?", do a similarity search over the database, find the knowledge related to equity, and return an answer based on the retrieved chunks. If you want to build vector databases this way, check out my tutorial. That route would have taken extra steps, though: the video content would first need transcriptions generated, and then embeddings created and stored in the database. Because CustomGPT let me skip those steps, I went with it.
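For reference, that retrieval setup could look roughly like this (a sketch, not what I built; the chunking, embedding model, and in-memory search are illustrative assumptions):

import numpy as np
from openai import OpenAI

client = OpenAI()

def embed(texts):
    # Turn each text chunk into a vector (model choice is illustrative)
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

# Pretend the YC content has already been scraped and split into chunks
chunks = ["Transcript snippet about equity...", "Website page about the application..."]
chunk_vectors = embed(chunks)

def retrieve(query, k=3):
    # Embed the query and return the k most similar chunks (cosine similarity)
    q = embed([query])[0]
    sims = chunk_vectors @ q / (np.linalg.norm(chunk_vectors, axis=1) * np.linalg.norm(q))
    return [chunks[i] for i in np.argsort(sims)[::-1][:k]]

print(retrieve("can you review this section about equity?"))

The retrieved chunks would then be passed to the model along with the user's question.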

After adding YCombinator’s website as a source to the project, I noticed it scraped some general information pages but missed the profiles of companies from YC’s previous batches. These profiles contain valuable information — they showcase how companies describe their businesses, emphasize key points, and present their competitive advantages. Such insights are crucial for an AI assistant tasked with reviewing applications.

This meant I had to scrape the data myself, so I fired up a Jupyter notebook in Google Colab to write the logic for gathering the data, creating a file, and storing it in Drive. From there, I could later download the file and add it to my project in CustomGPT.

The first step in the notebook was installing the necessary packages.
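In Colab, that cell looks roughly like this (the exact package list is my assumption):

# Install the scraping dependencies
!pip install requests beautifulsoup4 selenium

# Mount Google Drive so the output file can be saved there later
from google.colab import drive
drive.mount('/content/drive')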

After this, I wrote a function to generate an array of company links from the Startup Directory page.

YC Startup Directory
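A first pass at that function might look roughly like this (a sketch: naive requests + BeautifulSoup, grabbing every anchor on the page):

import requests
from bs4 import BeautifulSoup

def get_company_links():
    # Fetch the Startup Directory page and collect every link on it
    html = requests.get("https://www.ycombinator.com/companies").text
    soup = BeautifulSoup(html, "html.parser")
    return [a["href"] for a in soup.find_all("a", href=True)]

company_links = get_company_links()
print(company_links[:10])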

I ran the function but noticed that it only returned the links from the header, not the company section 🤔

I realized that upon the initial page load, only some HTML containing the header was available. The data related to companies was likely being loaded from the server asynchronously. To address this, I had to update the function to wait until that data became available. Additionally, since I didn’t want to load every link on the page, I inspected the companies section in my browser and specifically targeted that part of the page.
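Here's the updated sketch, this time assuming Selenium with an explicit wait; the CSS selector is a placeholder for whatever you find when inspecting the companies section in your own browser:

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

def get_company_links():
    options = webdriver.ChromeOptions()
    options.add_argument("--headless")  # no visible browser window
    driver = webdriver.Chrome(options=options)
    try:
        driver.get("https://www.ycombinator.com/companies")
        # Wait until the asynchronously loaded company links are in the DOM
        WebDriverWait(driver, 20).until(
            EC.presence_of_element_located((By.CSS_SELECTOR, "a[href^='/companies/']"))
        )
        anchors = driver.find_elements(By.CSS_SELECTOR, "a[href^='/companies/']")
        return sorted({a.get_attribute("href") for a in anchors})
    finally:
        driver.quit()

company_links = get_company_links()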

Now we have an array of the links we want to get data from!

The next thing we need is a function that scrapes them: we loop over the items in the array and write each page's contents to a file in Drive.
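A minimal sketch of that loop, assuming requests and BeautifulSoup and a Colab-mounted Drive path:

import requests
from bs4 import BeautifulSoup

OUTPUT_PATH = "/content/drive/MyDrive/crawled_content.txt"

def crawl_to_file(links, output_path=OUTPUT_PATH):
    with open(output_path, "w", encoding="utf-8") as f:
        for url in links:
            try:
                html = requests.get(url, timeout=30).text
                text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)
                # Separate each company profile so the boundaries stay visible
                f.write(f"\n\n### {url}\n{text}")
            except requests.RequestException as e:
                print(f"Skipping {url}: {e}")

crawl_to_file(company_links)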

crawled_content.txt became one large text file. I initially wondered if I needed to format it before passing it to the AI. However, once I added it to CustomGPT and asked a few questions in their chat, it turned out that the AI had developed a pretty good understanding of the text without additional formatting.

Inside the Chatbot Persona section of the settings, I added the initial instructions defining the agent's role: You are a custom chatbot assistant that is responsible for reviewing startup applications for YCombinator and helping founders update their applications and give suggestions on things and wording they have to change.

This would be the equivalent of passing the first prompt before sending the rest of the messages to the chat:

const completion = await openai.chat.completions.create({
  messages: [
    {
      "role": "system",
      "content": "You are a custom chatbot assistant that is responsible for reviewing startup applications for YCombinator and helping founders update their applications and give suggestions on things and wording they have to change."
    },
  ],
  model: "gpt-4o-mini",
});

Now it's time to build the UI that will use the CustomGPT API endpoints: "create a conversation" and "send a message." For this part, I generated API keys and set up a NextJS project. I initially wondered if NextJS might be overkill; perhaps I should have just created a React app with Vite, given that I only needed a simple form. However, to interact with the API without exposing the key in the browser, I needed NextJS API routes.

Every time a user submits their application information for review, I create a new conversation. The details aren’t stored anywhere — they’re simply sent to the AI to fetch a response.

After creating a conversation, I send requests to the “send a message” API to get suggestions on specific parts of the application. These requests include both the question from the form and the user’s answer. I selected questions from YC’s application form, but I didn’t include all sections since some contain personal information.
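Under the hood, those two calls are plain HTTPS requests to CustomGPT, made from the NextJS API routes. Here's a sketch of the flow, in Python for consistency with the notebook code above; the endpoint paths, payload fields, and response shape are my reading of CustomGPT's API docs, so check the current reference before copying:

import requests

API_BASE = "https://app.customgpt.ai/api/v1"
PROJECT_ID = "your-project-id"  # placeholder
HEADERS = {"Authorization": "Bearer YOUR_CUSTOMGPT_API_KEY"}

# 1. Create a conversation for this review session
conv = requests.post(
    f"{API_BASE}/projects/{PROJECT_ID}/conversations",
    headers=HEADERS,
    json={"name": "YC application review"},
).json()
session_id = conv["data"]["session_id"]  # response shape assumed

# 2. Send one message per application question, combining the question and the user's answer
reply = requests.post(
    f"{API_BASE}/projects/{PROJECT_ID}/conversations/{session_id}/messages",
    headers=HEADERS,
    json={"prompt": "Question: What is your company going to make?\nAnswer: ..."},
).json()
print(reply)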

For form components like Input and Card, I used shadcn to minimize styling effort. I installed the components and adjusted the Tailwind configuration to match YC's color palette. To do this, I pasted YC's main hex color into ZippyStarter's theme generator, which produced a configuration that I then added to my globals.css file. I only needed to tweak the background color slightly, as the generator's shade differed from what I wanted.

And this is the end result:

yc-interview-ai.vercel.app

If you found this walkthrough on building a custom AI agent for YC application reviews helpful, I’d appreciate a retweet!

Want to stay in the loop on more AI tools and tutorials? Subscribe to my YouTube channel, where I share courses on building full-stack projects.

Thanks for reading, and best of luck with your YC applications!
