As artificial intelligence rapidly reshapes the landscape of higher education, students are meeting these changes with both optimism and a healthy dose of skepticism. They see the promise of AI to enhance learning, but they’re also raising critical questions about fairness, transparency, and trust.

WGU Labs recently surveyed over 4,500 students enrolled at WGU about their awareness, usage, and perceptions of AI in learning. The findings from this survey sent a clear message: AI tools must be thoughtfully designed and implemented to truly serve the diverse needs of learners. Innovation alone isn’t enough. To unlock the full potential of AI in education, institutions and EdTech leaders must center students in the design process — particularly those who are most at risk of being left behind.

Here are five strategies to help ensure AI-powered learning is effective, equitable, and student-centered.

1. Rapidly expand AI training and support to close gender-based confidence gaps.

Students across all backgrounds are using AI tools, but not equally. Our research found persistent gender-based gaps in AI confidence, with men reporting significantly higher familiarity, comfort, and understanding of these tools than women.

These gaps represent an emerging equity issue that echoes past gender divides in the workplace. As AI becomes a core workplace skill, institutions must act now to close these gaps. That means embedding AI literacy across the curriculum, offering hands-on learning opportunities, and ensuring inclusive access to training, especially in fields where women are overrepresented, such as healthcare and education.

Policymakers and campus leaders should treat this as a priority. If women continue to face barriers to developing AI confidence, they risk being excluded from opportunities in an increasingly AI-integrated workforce.

2. Prioritize high-value personalization that aligns with student goals.

Students are most enthusiastic about AI when it helps them learn better, not when it simply automates tasks. Our survey found that roughly 59% of students were positive about the potential of AI tools overall. Nearly two-thirds were enthusiastic about tools that personalize learning based on their progress, interests, and needs, such as customized assignments, tailored resource suggestions, or academic coaching aligned with career goals.

To meet this demand, institutions should prioritize AI solutions that provide meaningful personalization. This includes clearly communicating how student data is used and ensuring students have a voice in shaping how tools evolve. Personalization should feel helpful and human-centered, not performative or opaque.

3. Build student trust in AI-supported assessment by highlighting its potential for fairness.

Our survey found that students are cautious about AI-based assessment and evaluation. Only about a third trust AI to assess their work and believe that AI-generated evaluations reflect their skills accurately. These results suggest institutions must build student trust before using AI in high-stakes decisions like grading and evaluation.

AI can play a supporting role in making assessments more consistent and equitable. For instance, using AI in formative assessments, like practice quizzes or draft feedback, can help students become more comfortable with the technology while improving learning outcomes. Over time, this exposure may build trust in how AI is used for higher-stakes evaluation.

4. Use AI to enhance student access to support services.

Students in our sample were wary of AI in deeply personal contexts like mental health or emotional support. For example, less than a third (32%) said they believed AI would be beneficial for emotional support or mental health guidance. But they’re far more open to AI for academic help (65%) or career help (59%) — particularly in the form of chatbots that offer around-the-clock guidance.

This distinction matters. Students aren’t rejecting AI outright — they’re responding to how it’s introduced and what kind of support it offers. Institutions should use AI to expand access to timely, low-stakes support while continuing to invest in staff who offer the human relationships students value most.

Done well, AI tools can help institutions reach more students with just-in-time help, freeing up human advisors to focus on building trust and community.

5. Communicate clearly and consistently about AI use to build student trust.

Above all, students want clarity and transparency. Ninety-two percent of students in our survey said it’s important to know when they’re interacting with AI, but nearly half weren’t sure when it was happening.

Transparency is essential to building trust. Institutions should clearly label AI-generated content, provide opt-out options where appropriate, and ensure students always have access to human support. They should also be upfront about how data is used and decisions are made. Without this clarity, even the most promising tools can quickly lose student confidence.

Innovating with intention

Students aren’t asking institutions to stop AI innovation. They’re asking us to move forward with intention — to close confidence gaps, design for real learning impact, and communicate clearly every step of the way.

By listening to students and responding to their concerns, higher education can harness the power of AI not just to scale learning but to shape a more inclusive, equitable future.