How can we harness AI to make lengthy tasks more efficient and improve performance while maintaining our own voice and without obstructing our learning? As AI becomes more advanced, Lakeside has taken steps to answer this question, formulating guidelines specific to models like ChatGPT and other prominent AI programs. The college counseling department is no exception.
The use of AI in college counseling and admissions is a newcomer to a process that, in many ways, has remained largely unchanged since the mid-19th century. Across the nation and the world, there have been discussions about how to incorporate different forms of AI to make the college counseling process more efficient in an ethical way.
At Lakeside, students are often encouraged to use AI to decrease word count and help reword sentences in their college essays. Ari Worthman, Lakeside’s director of college counseling, also notes that as the technology advances, it could prove to be beneficial for assisting with generating college lists or for compiling information about students from their quarter comments to make the process of writing recommendations much more efficient (as opposed to college counselors going through each quarter’s comments for every student).
However, there are also many ways in which AI can be harmful in college counseling. For example, platforms like ChatGPT can strip student essays of crucial elements such as voice and personality. We are then left with important questions to address: where should the line be drawn, and how can college counselors harness AI in ethical and beneficial ways? The college counseling department's answer to these questions is Addie.
Addie is an AI program made by Andrew Howard, originally intended to help high school students get admitted to the college of their choice. Before starting Addie, Howard worked on several AI-focused projects, including Wowsers, a company that created AI-based math video games. He began building Addie after conversations with college counselors like Mr. Worthman. The general consensus among counselors, at small schools like Lakeside and at large public schools alike, was that they wanted more time with students to understand them and help them make the most beneficial decisions for their futures. Unlike many AI programs built specifically to assist with student essays, Howard created Addie as a way for counselors to get to know their students better, in a way that is efficient for both the student and the counselor.
The class of 2026 was the first at Lakeside to be introduced to Addie. Prior to its implementation, juniors were required to fill out an extensive survey for college counselors to learn about each student before beginning the college counseling process.
Mr. Worthman notes that the goal of Addie was to make the survey process “more efficient and feel a little bit less cumbersome” as well as to “get students to think a little bit more deeply and make sure that some of the information we’re getting from the questionnaire is going to be as useful to us and as useful to them later on in the process.”
Current seniors were encouraged to use Addie, at the time a text-based chatbot, after completing the questionnaire as a way to strengthen their previous responses. However, the feedback that Mr. Worthman received was not what he had hoped: students were frustrated that the three-to-four-hour process of completing the questionnaire had doubled overnight.
After receiving these responses, the college counseling team worked with the developers behind Addie to transform it into the current product. Now, using Addie, students can have an over-the-phone conversation, turning a three-to-four-hour process into ten-to-fifteen-minute phone calls that they can have throughout their junior year. Mr. Worthman adds that “Addie is easy to talk to [and] makes them reflect on things and on questions that hadn’t been asked about themselves before.”
According to Howard, these conversations tend to be about 5,000 words long, which is significantly longer than what students would generally type. He emphasizes that, after many conversations with Addie, students will have “written books” about themselves, especially when combined with quarter comments that they have received throughout their high school career (generally about 50,000 words in total).
Howard also noted that Addie can save counselors considerable time when writing recommendations or generating lists of majors or possible career paths, helping overall with "things that would be really difficult for college counselors to do in a more scientific way." For students, Howard hopes that Addie can help them "start to find the path where they will thrive" and can "make sure your story doesn't get lost."
Though Addie is currently only being used in certain schools by students who are part of “pilot teams,” Howard anticipates its popularity will grow globally, noting that “students in Seattle are using it on their iPhone 17 in English, but Addie works on a flip phone in Sub-Saharan Africa to a student speaking Swahili.”
Some students, however, are not in support of the use of Addie. One student in particular, Kellen H. ’26, was apprehensive when Addie was first introduced and decided to look into the tool by asking Addie different questions to understand what was powering the technology and how it responded to certain prompts. Through these tests, Kellen found that Addie had functional issues, such as not being able to remember and repeat questions that he had just asked.
In addition to the functionality issues that he discovered, Kellen also had several privacy concerns. For example, Addie’s website promises that the technology follows regulations, including FERPA (the Family Educational Rights and Privacy Act), to manage student data. However, Addie is powered by ChatGPT, a platform that is not compliant with FERPA. In Kellen’s 14-page write-up on the subject, he noted that it was “extremely concerning for sensitive data to be leaving official networks,” though “the API [a tool that lets the website talk to ChatGPT] claims that data is handled differently for these professional applications.”
Furthermore, Kellen received an email from Andrew Howard after he asked Addie to write him a cupcake recipe as a way of proving that the platform “is not really a counseling AI…it’s just a chatbot with some instructions.” This also alerted Kellen to the possibility that Howard or another employee was “manually reviewing messages,” which seemed to him a “gross invasion of student privacy.”
After hearing Kellen's concerns, the college counseling team took steps to further ensure that Addie was secure, having Lakeside's cybersecurity consultants audit the platform to "make sure they were compliant with all the standards that Lakeside has for all of its technological platforms … Addie passed with flying colors." This audit was done through Ankura, a cybersecurity firm that Lakeside has worked with for the past four years. Ankura supplies a questionnaire that platforms handling student data must fill out; it covers several security-related topics, though it does not address FERPA compliance. It is worth noting that Lakeside, as an independent school that receives no federal funding, is not required to adhere to FERPA regulations. The school nonetheless aims to comply with FERPA and has chosen to share only the student data necessary for Addie's functionality.
Mr. Worthman recognized that AI involves many unknown variables, and that it can be difficult to entrust information to an unfamiliar form of technology. He says that the concerns shared with him by parents and students such as Kellen didn't worry him "in the sense that the concerns, in many places, were driven from irrational fear. But that irrational fear was something I knew was out there, and I should have responded to that at the outset and not presented it in such a way, which made the fear even worse."
He also commented that the amount of student data held by Addie is significantly less than the amount of information that other commonly used companies like Google have on students. “If a student uses Google,” he noted, “and they’re concerned about information being accessible to company employees, Addie should truly be the least of their concerns.”
Similarly, Andrew Howard noted that as Addie has developed, its privacy has “gone from being a bug to a feature.” On the website, the company states that it has been audited by Common Sense Privacy and confirmed to be FERPA-compliant because individual students and parents can control the data that they give to Addie. According to Howard, other than counselors, only Addie employees have access to student data, and they “don’t have time to sit around and read … novels and novels on every student.”
However, Kellen does not see the issues as resolved, remarking that “there’s not much of a product, even now.” He hopes that students who choose to use the technology know that “anything you send to Addie can be, and very possibly will be, seen by a non-Lakeside person.”
Like Kellen, many other seniors have continued to avoid using Addie, and some have refrained from using AI overall. In the Tatler poll, one senior wrote that they “have not touched Addie with a 10-foot pole because of the security concerns and the college counseling office’s rash decision to adopt it.” Another wants “the satisfaction of knowing that I got into any of the schools I got into without help from AI.” Some have chosen to use other resources like ChatGPT and Claude to review essays, research schools, or provide questions to ask admissions officers at college visits.
Some juniors, however, were given the opportunity to interact with Addie as part of a "pilot team" that communicates with and gives feedback to the developers, including Andrew Howard. One student on this team, Elizabeth W. '27, decided to join because Addie felt "easy and approachable" and offered a way to skip the "exhausting process of having to fill out a bunch of questionnaires."
For Elizabeth, conversations with Addie have been about “exploring your interests or finding a major that’s the right fit for you” with “very, very open-ended” and “future-oriented” follow-up questions.
While Addie is often glitchy and sends excessive notifications, she notes that the developers have been very responsive to feedback from these pilot groups. Addie also has a feature that can suggest careers and majors for students, though she describes how it “kind of scammed [her] over,” as the recommendations she received did not align with her current interests.
In terms of security, she considers a conversation with Addie closer in formality to an interview than to a discussion with a friend or even a college counselor, and she recognizes that she wouldn't share very personal information with it. On the whole, Elizabeth and other juniors on the pilot team have largely enjoyed using Addie as a way of bypassing the lengthy questionnaire process, though flaws in the technology remain.
Overall, while the college counseling team is confident in the security and functionality of Addie, many students have concerns and have chosen to avoid the tool. However, the technology behind Addie is continuing to improve and adapt to the needs of students both at Lakeside and around the world.
So is Addie an important tool to simplify the complex and arduous task of college applications? Or is it risking your privacy and bypassing a process that is crucial for your college counselors to really understand who you are as a person, so that they can best help you find a path on which you will thrive? It’s up to the readers to decide.
