Three Decades in Software. Twelve Years in College Counseling. Seven Weeks to Build Something That Combines Them
A couple of weeks ago, I was on a Zoom call with a student I’ve been working with for a while. We’ve gotten to know each other, and the conversation wandered, the way good conversations do. We both love (and love to create) music. He mentioned something he was working on, so I asked him to play it for me (he was next to his keyboard). That reminded me of something, and I shared a piece I recorded, which you can find on YouTube. He was amazed that I made music too, and it was one of those moments that doesn’t fit in a session report. Every conversation since has been deeper because of it.
I’ve been thinking about that conversation since, because it’s the reason I’m building what I’m building, and maybe also the reason I almost didn’t.
The Question Worth Asking About AI and College Counseling
Mark Sklarow, who led IECA for many years, recently wrote about whether AI will replace college counselors. His answer was that it wouldn’t, and I agree. But the more interesting question is what AI makes possible that wasn’t possible before. AI will undoubtedly transform the work we do, as it is transforming medical diagnosis, financial analysis, and, horrifyingly, even war.
I’ve been interested in artificial intelligence since long before my students (and many professional colleagues in this work) were born. In the 1980s, as a graduate student at the University of Illinois, I worked with Professor Narendra Ahuja on detecting edges in images — teaching computers to see boundaries that human eyes perceive instantly. There’s something amazing in thinking about how easily we perceive things, walking into a strange cafe in an unfamiliar city, and deciding that the bundle of light and dark in a corner is a beautiful woman drinking coffee.
What AI Actually Makes Possible for College Counselors
I spent decades in technology after that, building large telecommunications systems, and then spent the last twelve years as a college counselor. As far apart as they are, the two worlds have never felt separate to me. So when large language models arrived, I paid attention. I incorporated AI into our platform at LifeLaunchr, building college profiles that answered not just quantitative questions about admissions rates and average GPAs, but qualitative ones about culture and feel.
One way of thinking about the use of AI is that it will make us more efficient: speeding up things that took a while, like a spreadsheet on steroids. But that isn’t really it. You don’t need AI to do the work we do, and some counselors will create unique, successful practices without it, just as fine furniture gets made by hand and beautiful cloth gets woven on handlooms. The tools don’t determine the quality of the practitioner.
Three Scenarios That Changed How I Think About This
Arrive at Each Meeting Fully Prepared
But AI changes what’s possible. Imagine this scenario. You’re meeting with a student today at 4 PM. To prepare, you read all your old notes, review the student’s college and scholarship list, and put together an agenda. It might take a while, but more importantly, you might miss something. And you have no idea what the student did since your last meeting. Teens are famously reluctant to share. Now imagine simply clicking a button and having the tool review everything – old notes, college lists, activities lists, summer program lists, and agenda items that would be common in the month of March. Even more important, imagine that the tool reviews what the student did since the last meeting, and uses that to prepare a meeting brief. How much better prepared can you be?
During the meeting, you take notes but concentrate on your conversation. Afterward, you paste in an automated transcript, and you have session notes nearly ready to send that combine what you wrote with the transcription.
Guide Your Students Even When You’re Not With Them
Here’s another example: Counselors struggle with getting kids to research colleges, scholarships, and summer programs, and part of it is that teens don’t know how. All your presentations and webinars fall on deaf ears. But imagine a tool that talks to your students when you’re fast asleep, and guides them through the process as you would. It’s like an assistant who’s available 24×7, but imbued with your philosophy and approach. And it can base its suggestions on everything it already knows about a student: their extracurricular interests, academic performance, and the list of colleges they’re researching.
Find the Great Colleges and Majors You Didn’t Know About
Or what happens when a student says, “I like university X. Can you suggest others that are like it?” You can draw on your knowledge and ask colleagues. Nothing replaces what you know when you put your feet on a campus’s grounds, of course, but sometimes you need more. A good AI system doesn’t replace your judgment, but it can know more. And when it’s built right, it can ask the questions that drive the student to think about what they want and how each college connects to it. “Do you like university X because of the collaborative culture, its stellar honors program, or is there something else you find interesting?”
What I Built in Seven Weeks, and Why “I Built an AI Tool” Doesn’t Cover It
So about seven weeks ago, I decided to build something. I had my normal caseload, my normal family responsibilities, and a vacation in the middle of it. What I built during that time surprised even me. At home, the jealously-uttered phrase, “You spend more time talking to Claude than to me,” has become common (it may well be true :).
I want to be precise about this because “I built an AI tool” has become a nearly meaningless sentence. Along with the amazing Vinay Narang, I co-lead the AI Learning Lab at HECA, so I’ve spent considerable time thinking about what AI should and shouldn’t do in this field. What I built is a production application: three-tier architecture, a responsive interface that renders correctly on every device, a downloadable app for phones, secure user authentication, and three distinct user types — counselors, parents, and students, each with their own experience. It goes well beyond a chatbot.
I don’t say that to impress you (ok, maybe I do), but because the speed and quality of what these tools make possible is something every counselor needs to understand viscerally. My years of software expertise were critical because this orchestra needs a worthy conductor. The tool writes code, suggests architecture, and can talk to you as a great software team lead would, but you have to be able to handle the other side of that conversation: know enough to know which ideas seem right but are profoundly wrong.
I decided to name it Soar. And as I went through the process, I realized that it wasn’t a simple tool to build. AI raises important questions about ethics and principles that require considered decisions.
For Essays, Brainstorm, Don’t Write
Essays are the clearest case. Last fall, I wrote a piece summarizing the state of play on AI in college admissions, and I will update it this fall. The bottom line is that colleges either discourage or forbid AI-generated essays, not just because of concerns over cheating, but because an essay written by AI isn’t the student’s voice, and a student’s voice is precisely what the admissions process is trying to hear. So when I built Soar, I added instructions to tell it not to write essays for students. That’s a hard constraint, not a suggestion.
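One way to make that constraint “hard” rather than advisory is to enforce it in two layers: a standing instruction to the model, plus a check on what comes back. The sketch below is hypothetical (the names, prompt text, and heuristic are mine, not Soar’s actual implementation), but it illustrates the pattern of refusing full essay drafts while still allowing brainstorming.

```python
# Hypothetical sketch: a "brainstorm, don't write" guardrail in two layers.

ESSAY_POLICY = (
    "You may help the student brainstorm topics, outline ideas, and ask "
    "reflective questions. You must never draft, rewrite, or complete "
    "application essay text on the student's behalf."
)

def looks_like_essay_draft(reply: str) -> bool:
    """Crude heuristic: multiple long, continuous paragraphs suggest a draft."""
    paragraphs = [p for p in reply.split("\n\n") if len(p.split()) > 120]
    return len(paragraphs) >= 2

def guarded_reply(model_reply: str) -> str:
    """Second layer: even if the model ignores ESSAY_POLICY, catch drafts."""
    if looks_like_essay_draft(model_reply):
        return ("I can't write essay text for you, but I can help you "
                "brainstorm: what moment from that experience surprised you?")
    return model_reply
```

A real system would use a more careful classifier than a word count, but the design point is the same: the policy lives in the application, not in the model’s goodwill.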
Overestimated Likelihoods Hurt Students
Admission likelihoods are another hard case. ChatGPT, Claude, and Perplexity will sometimes overestimate a student’s chances at a college to tell them what they want to hear. That’s a genuine problem when families are making consequential decisions. Soar uses an algorithm derived from my experience as a counselor, and lets individual counselors override it based on what they know about a specific student and their plans. The point isn’t to have a perfect algorithm: it’s to be honest about what the data supports and where human judgment should take precedence.
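The override mechanism is the interesting part: the algorithm produces a number, but a counselor who knows something the data can’t see gets the last word. Here is a minimal sketch of that precedence rule, with hypothetical names and illustrative banding thresholds (not Soar’s actual algorithm).

```python
# Hypothetical sketch: algorithmic estimate with a counselor override.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Likelihood:
    estimate: float                               # model-derived probability, 0.0-1.0
    counselor_override: Optional[float] = None    # set by the counselor, if they disagree
    override_note: str = ""                       # why the counselor disagrees

    def effective(self) -> float:
        """Human judgment takes precedence over the algorithm."""
        if self.counselor_override is not None:
            return self.counselor_override
        return self.estimate

    def band(self) -> str:
        """Illustrative thresholds for reach / possible / likely."""
        p = self.effective()
        if p < 0.15:
            return "reach"
        if p < 0.50:
            return "possible"
        return "likely"

# The algorithm says 10%, but the counselor knows something it doesn't:
chance = Likelihood(estimate=0.10)
chance.counselor_override = 0.35
chance.override_note = "Recruited athlete; coach has expressed strong interest."
```

Keeping the note alongside the override also preserves the reasoning, so the next person looking at the list sees why the number differs from the model’s.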
Fact-checking is Paramount
The third issue is hallucinations. General-purpose AI tools make things up, and in most contexts that’s an inconvenience. In this context, a student might make a college decision based on fabricated information about a program that doesn’t exist or a scholarship with wrong eligibility criteria. I built Soar on a foundation of real data: 1,800+ colleges, 6,700+ scholarships, and 250+ summer and enrichment programs. I spent time gathering data from reliable sources and came up with ways to keep it current. Young people are making consequential decisions about their lives. They deserve accurate information.
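The anti-hallucination design boils down to a simple rule: answer from the curated dataset, and when a record isn’t there, say so instead of generating one. The sketch below uses a hypothetical schema and function names to show that lookup-first pattern; the one sample record reflects the real Coca-Cola Scholars award, but the data structure is illustrative.

```python
# Hypothetical sketch: answer from verified records, never from generation.
SCHOLARSHIPS = {
    "coca-cola scholars": {
        "award": "$20,000",
        "eligibility": "U.S. high school seniors",
    },
}

def scholarship_answer(name: str) -> str:
    """Look the scholarship up first; refuse to fabricate if it's missing."""
    record = SCHOLARSHIPS.get(name.lower())
    if record is None:
        # No record, no answer: better a gap than a confident fabrication.
        return f"I don't have verified data on '{name}'."
    return f"{name}: {record['award']}; eligibility: {record['eligibility']}"
```

In a production system the dictionary would be a maintained database of those 6,700+ scholarships, but the contract is the same: the model narrates the data, it doesn’t invent it.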
What This Is Really About
I think about these questions the way I think about any powerful tool: the responsibility isn’t to avoid the tool, it’s to use it wisely. Fire keeps us warm. It can also burn. The difference is the person holding it.
I’m writing this because I think counselors need to engage with AI seriously: not because those who don’t will be left behind, but because these tools genuinely expand what we can offer the students and families who trust us. The Zoom call with my student, the music, the moment of actual connection — that’s what we’re trying to protect. Everything else is in service of that.
If you’re curious about Soar, I’d love to show you. Drop me a line, and I’ll send you an invite. LifeLaunchr students will receive an invite soon, and all your data will be migrated over so you can use our new tools to replace the current platform. If you’re a counselor, it’s designed to work alongside the tools you already use — CollegePlannerPro, Maia Learning, or whatever your current system is. It’s not a replacement for anything. It’s the thing that works in between. And you can start for free.