OpenAI Forum

The AI Culture Shift: American University’s Model for Change

# AI Adoption
# Higher Education

SPEAKERS

Angela Virtu
Lecturer @ American University

Angela Virtu is a Professorial Lecturer in IT & Analytics at American University’s Kogod School of Business, where she empowers students with cutting-edge knowledge in technology, data analytics, machine learning, and artificial intelligence (AI). As Associate Director of the Institute for Applied Artificial Intelligence and AI Instructional Faculty Fellow, Angela leads Kogod’s AI adoption strategy across disciplines.

Prior to academia, Angela built transformative AI and machine learning solutions for tech startups, driving impactful business outcomes through optimized workflows and advanced algorithms. Her work bridges theoretical advancements with practical applications while maintaining a strong commitment to ethical innovation.

A passionate advocate for AI literacy, Angela envisions a future where technology empowers individuals and organizations alike to achieve growth. Through her dual roles as educator and strategist, Angela continues to shape the intersection of business education and artificial intelligence.

Siya Raj Purohit
Education GTM @ OpenAI

Siya Raj Purohit is an education leader, author, and investor. She works on Education at OpenAI and is a General Partner at Pathway Ventures, an early-stage fund investing in the future of learning and work. Siya was previously an early employee at Udacity and Springboard, an investor at GSV Ventures, and the founding EdTech/Workforce category lead for AWS Marketplace. She is the author of Engineering America, a book on the country's jobs-skills gap.


SUMMARY

Siya from OpenAI and Angela Virtu from American University's Kogod School of Business present a detailed overview of what it means to become an "AI-native university." They discuss the rollout of ChatGPT EDU, strategic AI curriculum development, cultural shifts in higher education, and how AI is transforming teaching, learning, and career readiness. Angela shares American University’s phased approach—from pilot projects to institutional integration and strategic transformation—with a focus on community building, ethical AI use, and workforce alignment.


TRANSCRIPT

Hi, everyone. I'm Siya, and I'm on the OpenAI education team. Thank you for joining today. Before we get started, I want to make you all aware that this event is being recorded and will be published in the OpenAI forum community after the event.

We always like to start our talks by reminding us all of OpenAI's mission, which is to ensure that artificial general intelligence, by which we mean highly autonomous systems that outperform humans at most economically valuable work, benefits all of humanity. Some of you may see at the top right of your screen two tabs, chat and Q&A. During the presentation, make sure you drop all your questions in the Q&A tab.

And at the end of the presentation, Angela and I will do a moderated Q&A session to answer all of your questions about ChatGPT EDU and how to use AI to transform your university campuses.

I want to talk a little bit about ChatGPT EDU, which is our enterprise-grade product designed for universities and school districts. ChatGPT EDU is a secure workspace. OpenAI does not train on any of the data from the EDU workspace, and your admins don't read your chats. It has higher limits than our free account, so you can play around with more of our features, and it gives you the opportunity to enable a connected campus experience through custom GPTs.

Now, our vision for AI-native universities. Our current thinking is that AI-native universities are these campuses that will have multiple AI touch points that help students, faculty, and staff converse with the knowledge of their campus more deeply, conversationally, and intimately than ever before.

From student orientation to classrooms and student clubs, there will be custom GPTs and solutions to help community members get answers to the questions that they have and help guide them through their experience on campus. So they can ask questions like, how do I register for classes? Where is the best pizza place in town? In this class that I'm learning in business school, which CEO handled layoffs well? You'll get very precise answers because it's using your university's knowledge to help guide your community. That is our vision, and my team helps a lot of universities move up the AI maturity curve.

So most of the campuses that we work with start off with individual use. This is students and faculty members using AI to solve their own problems. We often hear from a professor who says, I have to write so many letters of recommendation in a semester that I have built a custom GPT using my former letters of recommendation, so I can create them faster. So these kinds of individual problems we see a lot of campuses solving.

And then it moves up to small pilots. This is when teams and departments start working together to use AI to help make their jobs easier. For example, a campus told me that it takes 40 hours to identify which course goes into which classroom on their university campus. And now ChatGPT does that for them. It helps allocate very dynamically where the classes go. It helps streamline operations and makes it much easier for the different teams that run ops on the campus to do their jobs.

The third step is institutional impact. This is when we start seeing those multiple AI touch points. Many of the university campuses that we work with created over 20 AI touch points, helping guide their community through the different parts of their campus and helping answer questions throughout, like being a 24-7 TA. Most professors tell us students ask questions between 12 a.m. and 3 a.m. when a human tutor is not available. But with AI, you always are supported no matter where you are on campus.

And then the final steps are around strategic integration. So when you start thinking about how AI impacts academic outcomes and job readiness for your students. And finally, an AI-native university. This is when your campus is ready to tackle social and ethical challenges through research innovation and collaboration with AI.

So this is how we're thinking about transformation on campuses. But one question we keep getting from education leaders is about embedding AI on their campus and the cultural transformation that requires.

I'm really excited to have Angela Virtu from American University's business school join us today to share how their campus went through that shift. I first co-presented with Angela in D.C. at the HolonIQ Summit this spring, and I was amazed by the clarity of thought and structure that AU has set up around AI usage for both faculty and students. So I'm sure all of you will learn a lot about how to think about the cultural transformation on your own campuses through this presentation.

Thanks, Siya, for the warm introduction. Thanks for having me here today. So again, my name is Angela Virtu. I'm a professor at the Kogod School of Business at American University, and I'm also the associate director of the Institute for Applied Artificial Intelligence.

And as Siya mentioned, over the past 18 months, our business school in particular has been very intentional and very thoughtful about how we can become an AI-native university. And all of the work and AI integration that we've done would not have been possible without a huge cultural shift across all of our stakeholders, from our top leadership to our faculty, our staff, and most importantly, our students.

And so as we wrapped up our first full year rolling out our AI curriculum: over the past year, we have introduced 58 new classes with AI deeply embedded into the curriculum itself. This coursework has been integrated across all of our individual majors, minors, and programs at both the undergrad and graduate level. That way, every single one of our students is going to be able to graduate with a degree from American University having core AI literacy skills as part of their degree, regardless of their major.

As a part of this cultural shift, we've also hired six new faculty members with particular AI expertise to help lead the charge and reshape our AI education. And it goes beyond classroom buy-in: 40% of all of our tenure-line faculty are using AI within their scholarship initiatives. Throughout all of this AI embedding into our core curriculum, we have also adopted AI-specific learning outcomes, where each of our courses pairs its traditional learning outcomes with an AI companion outcome. And as part of that cultural shift, we have been running speaker series, bringing in our industry partners, and holding faculty trainings throughout the past 18 months to support these initiatives.

And I'm going to go into all of this in more detail over the next 20 minutes or so, so that you all can understand how we approached this cultural shift and how we got the buy-in from our faculty, our staff, and our students to make all of this possible. Because when you're looking at this on the slide, it can feel really overwhelming, and almost impossible, that we accomplished this in such a short amount of time.

And so I'm going to take us back to 18 months ago to where the spark of the idea of AI at Kogod began.

So 18 months ago, back in the fall of 2023, we brought, as part of our speaker series, Brett Wilson and Kent Walker onto our campus.

And in each of their talks, our Dean heard some of the top speakers in AI talk about the impact that AI is going to have, not only on education, but on businesses. And our Dean was really hearing what they were saying. The most profound moment came from Brett Wilson, when one of our students asked the question, is AI going to take my job? And he answered, it's not going to take your job, but someone who knows AI will take it instead.

And so that made our Dean run with this AI initiative. And we kept building this buy-in from our leadership, from the top down, off the demand that our professional industry leaders and connections on campus kept voicing. They kept promoting AI, and they kept coming to our campus and showing us. We brought alumni onto our campus to show us, hey, how are companies in finance and accounting and marketing using AI? And every single time they'd come to our campus, they'd do a live demo. They'd say, hey, it's completely changing what I used to do my first year out of college, or it's completely changing what I did two, three years ago in my job.

And so we saw this fundamental culture shift happening, not just within our campus, but within the greater business community surrounding us. The community and the industry are demanding that our students be prepared with core AI literacy skills. Just knowing marketing, or just knowing finance, or just knowing accounting is no longer going to be enough, because the skill set of knowing what AI is, how AI works, and how it's going to be applied in each of these core business functions is now being demanded and asked for in the marketplace.

And so this call and demand from all of our industry partnerships and all of our business partnerships really sparked the change from the top down.

Now, just because we have a dean who's saying, hey, we like AI, we need to go do this, that doesn't always get everybody else on board from the faculty and the staff side.

And so this is where we had to put a lot of investment into training the trainers. The next big phase came about a year ago, where over the last spring and summer, we focused a lot of our efforts on training all of our faculty and our staff. We knew that if we wanted all of our students to leave our school with these AI core competencies and outcomes, then all of our faculty who would be teaching these students, and all of our staff who would be interacting with them, would also have to be AI literate. Because if they themselves were uncomfortable with AI and how it can transform businesses, we knew that our students wouldn't necessarily be able to reach those outcomes.

One of the best ways that we got our faculty buy-in was through a monthly AI Second Friday series. Every second Friday of the month, we would host a really, really casual, super open conversation. We'd let all of our faculty and all of our staff in, and we would just have open and candid conversations about what problems we were experiencing with AI, how we'd tried using it in some of our curriculum, how we thought about it from a research perspective, or what a new update from OpenAI meant: they just got a new model, or they just got a new feature inside their platform, and how we might be able to use it or iterate on it from either a student or a research perspective.

And what this did was allow us to create a culture of, hey, none of us really knows how to use this yet, so it's going to be trial and error. We were able to come together, share our big wins, share the things that didn't work super well or that we still had questions about, and learn in a social environment. We tell our students every single day, you come to class because learning is a social activity, and it's no different for us as faculty or staff.

And so I would encourage every single one of you, even if you don't have high-level leadership buy-in on AI at your school right now, to try to find your AI community, build those connections, and find out how AI is being used on campus, who's using it, and what's working and what's not, because you're going to be surprised by the creative efforts that are happening and the different use cases that are going on.

It was at these events that we learned some of the coolest applications of AI on our campus. One of the really, really cool applications comes from our management department, where we have created a GPT that our students can use to negotiate against. So we have a little GPT loaded up with 10 different negotiating styles. Students can select one of those 10 negotiating styles, along with a case that they have, where the AI takes on the persona of the buyer and the student is the seller, and they can negotiate back and forth with the GPT to work on their negotiating skills against those 10 different styles.
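At its core, the negotiation GPT described above comes down to composing a system prompt from a chosen style plus a case. Here is a minimal sketch of how such a setup could be wired; the style names, descriptions, and function names are illustrative assumptions, not Kogod's actual configuration.

```python
# Hypothetical sketch of a negotiation-practice persona builder.
# Style names and wording are illustrative, not the real deployment's.

NEGOTIATING_STYLES = {
    "competing": "You push hard for your own terms and concede slowly.",
    "collaborating": "You look for trades that grow value for both sides.",
    "accommodating": "You prioritize the relationship over the price.",
    # ...a real deployment would load all ten styles; three shown here.
}

def build_buyer_persona(style: str, case_summary: str) -> str:
    """Compose the system prompt that makes the AI play the buyer."""
    if style not in NEGOTIATING_STYLES:
        raise ValueError(f"unknown style: {style!r}")
    return (
        "You are the BUYER in a sales negotiation. Stay in character.\n"
        f"Negotiating style: {style}. {NEGOTIATING_STYLES[style]}\n"
        f"Case background: {case_summary}\n"
        "Never reveal your reservation price; let the student (the "
        "seller) practice discovering it through questions and "
        "counteroffers."
    )

prompt = build_buyer_persona("competing", "Used campus shuttle fleet, 12 vans.")
print(prompt)
```

The same string could serve as the instructions field of a custom GPT or as the system message in an API call; the pedagogical content lives entirely in the prompt, not in any code.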

The other way that we have really been able to get buy-in from the faculty standpoint is that, in addition to these really casual coffee conversations about AI, we've brought industry individuals onto our campus. We bring in our alumni and industry leaders within the DC area to come demonstrate exactly how they're using AI within their businesses today, right? Everybody likes show and tell. And that has produced some of the biggest aha moments and has been one of the ways we've gotten our faculty on board, because they can see with their own eyes the impact that AI is having today, right now, and not just in theory or on some long, distant horizon. This is also where we held our more technical trainings, to give all of our faculty an understanding of what prompt engineering is and what AI even is, so we all had the same language going into it.

The second way that we've been able to build this community from the faculty standpoint is that we've been very thoughtful with all of our new hires. Over the course of the past year, we've brought in six new faculty hires with an AI-specific focus in their research, which brings new ideas of how we can be using AI in this business context as well. And this is on top of some really thoughtful AI leadership that has already existed on our campus for a really long time. All of this faculty buy-in then led to direct innovation across our curriculum, to where this past fall we were able to launch our AI-first curriculum, which included the 58 courses that I talked about at the very beginning. But none of that curriculum development would have been possible if it weren't for the strong leadership at the very beginning, with our deans seeing the impact that AI was having on our industry, and then the grassroots effort to get our faculty to buy in on what AI is, why it's important, and how they can start embedding it into their curriculum.

So from our curriculum standpoint, from the very first day that our students walk onto campus, they are introduced to AI. So at the graduate level, all of our graduate students are getting AI within their orientation. And this AI at orientation is our first chance to really level set with our students about not only what AI is and how it works, but what Kogod's expectation is on our students of how we expect them to use it ethically and responsibly within all of their coursework and how we want them to use AI throughout their academic career at Kogod. So that is the first taste that we get to start setting some expectations and having some AI governance conversations with these students. This also happens at the undergraduate level where in the first semester, all of our freshmen will take a business technology course and inside that business technology course, we have that exact same conversation that we have with the grads at orientation, but over a few extra weeks during the actual academic year inside that curriculum.

Now, when we say we have 58 new or updated AI-infused courses: we decided to go with two different tracks, because, as I said at the very beginning, our courses span all of our domains. We have marketing AI courses, finance AI courses, accounting, management; it spans all of our disciplines. And the way that we were able to achieve this is by creating two different classifications of AI coursework. First, we have our AI SAGE courses. These are all of the courses with hands-on applications, the more technical classes; this might be a technical class on financial modeling using machine learning and AI. On the other hand, we have our AI artisan courses, which are our more theoretical and ethical case studies of AI. This might be a course where AI is being used in a particular use case or a particular industry, and we'll have students think about whether that's the best way to use it or not. So it's a little bit of a lighter touch, where we get to have conversations around AI and its impact on different areas.

And the other point about all of this AI curriculum, and the demand that we're hearing from industry, is that it's not just these technical AI skills in individual domains that students need; it's also professionalism. They need to come to jobs prepared, show up on time, be able to have conversations with individuals, and be able to argue their points in respectful ways. And so we've paired our students' AI education with professionalism: all of our students are graded on how they speak and how they show up to class. This is how we get all of our students to be not only the most tech-literate, but also the most prepared when they reach their outcomes in four years, or two years, or however long they are with our institution.

Now we have our buy-in from our students, our leadership, and our faculty, but this doesn't happen overnight, and it doesn't happen without a little bit of struggle, disagreement, or pedagogical pressure. Our stance on AI at the Kogod School is that we don't want AI to replace the learning or the critical thinking that our students are doing. So whenever we implement AI, we do it with the intention of designing our courses to go a little bit deeper into a subject.

For example, I teach one of our more technical courses, R and Python programming for business analytics. When I get our business students to take these more technical classes, you can imagine how frustrating it is to learn a new language. A lot of times, before AI was on the scene, my students would wanna give up when they had a comma in the wrong spot and all of their code broke, and they were the most frustrated individuals in the world. So now, toward the end of the semester, once they have the basics of being able to read and understand code, I'll allow them to use AI as a co-pilot on their final project to help them write and create their code. And because of this, I now push them even deeper and further into the subject on the final project: I'll have them create a full-stack application, with a front end and a back end, and in two to three weeks they're able to create a fully working, demo-able app. They have a whole dashboard created, and they're able to achieve this because of the expansion power that AI has in terms of their productivity.

The last thing that I wanna talk about, with how AI has been having an impact on our teaching and learning and the way that we've gotten the most student buy-in to start thinking about incorporating it into their work, is that we've spent a lot of time building a culture of transparency around all of our AI use, for all of our stakeholders. That includes the professors as well. And so, in partnership with our professionalism and communication group, we've developed an AI disclosure form. We'll have this AI disclosure form be paired with a particular assignment, whether it's a paper or a presentation or a report or a project or a homework or whatever it is. And we'll have students do a small reflection about whether they've used AI on that assignment, and if they did use it, provide additional information as to how they've actually been using the technology. Have they used it for brainstorming? Have they used it in their research? Have they used it for editing or getting feedback, because they've uploaded the rubric along with the project instructions and their paper to say, hey, give me some feedback. What should I work on? Do I have a strong hook? Do I have concision in my business-style writing? Whatever that might be. And then at the very bottom of this individual disclosure form, you can see how we ask for the chat URL, so we can basically see the thought process of these individual students and how they're interacting with the AI.

Now, of course, students could just submit this and say, hey, I didn't use any AI tools, and go on with their day. And that, in fact, is what I saw a lot at the very beginning of the semester when I first introduced this. And then I thought to myself, hey, you know what? They probably just don't know what I mean by this.

So what I started doing is that, in the middle of my lecture, if I had used AI to help me with my course preparation, maybe to come up with an in-class assignment or an in-class activity, or a better example that's more relevant or more timely than the one from the textbook from five years ago that none of the students would relate to, I would have a two-minute conversation with the students and show them the prompt that I used with the AI. I'd say, hey, I used AI to come up with this in-class activity or this in-class assignment, and then we could have a little reflection about it: what do you think about that? Do you like that idea or not? Was this a good use case of AI or not?

And I found that the more transparent I, as the instructor, was with the students in disclosing my own use of AI, the more activity and more responses I started getting from this AI disclosure form from my students. And this has been super helpful in giving me insight, as the instructor, into how students are thinking about using AI within their own work, and in letting me start to see their thought processes. Are they just copying and pasting the assignment into ChatGPT, or are they starting to use it more thoughtfully, as a way to go deeper, or to engage with the content in a way that's more individualized to their needs?

Maybe they don't understand a concept from my class and they say, hey, here's this topic from class, explain Moore's law to me like I'm five, right? Because whatever the professor said doesn't necessarily resonate with me. And so the more transparent we can be as professors, the more transparency, and respect, I think the students give you back.

And the final way I was able to really get the participation in this type of an extra activity to get that bottom portion that's a little bit more open-ended is I told my students, hey, if you give me and you show me your work, like we're in math class back in high school, I say, hey, if you show me your work, I'll review it and I can give you some tips on your prompting.

Now, can every instructor necessarily do that, or want to add that extra workload onto their plate? Probably not. But I also found it a way to encourage the students to submit it and show me their work, versus just checking a box or two and submitting the assignment as a whole.

So again, our entire AI development has been super, super successful because we've gotten buy-in from all levels of our community. We have it from the top down, we have it from our faculty, we have it from our staff, and now we have it from our students.

And so what we're working on at the moment, through our Institute for Applied Artificial Intelligence, is taking all of the impact that we've had within our business community and starting to scale it to reach the entire campus. It's an interdisciplinary research institute, with a leadership team drawn from both the business school and the computer science department.

We have over 20 affiliate faculty fellows who span the entire campus, including our law school, our school of communication, and our school of public affairs. Every single academic unit on our campus is represented inside this institute in some capacity. And our goal with this research institute is to start carrying the AI education embedded in our business school curriculum over and expanding it AU-wide, promoting that innovation in AI education and our AI research all the way across campus, to reach that AI-native university that Siya mentioned at the very beginning of this presentation.

And so again, just to recap: this AI transformation that Kogod has had over the past year, year and a half, would not have gone as far, as quick, or as deep if it weren't for the cultural shift and the cultural change that we have been building week by week, day by day, month by month. And so as you all reflect on your own institutions, I would say having that Dean-level support is really, really important and crucial.

If you still need some buy-in, I would encourage reaching out to industry experts in your individual fields and domains and just having honest conversations about how AI is disrupting that industry, and really considering, from the student perspective, what career outcomes look like for them and how AI is disrupting those. Because as you start thinking about the student impact, that's most likely where you can get a little more Dean-level support.

The second thing that I would say is invest in your community building, right? Have that social learning experience and expertise within your entire faculty, have the conversations, share about what's working, share about what's not working. And by having a really open community about how you're using AI, you're going to be able to move really fast with any kind of trial and error of using AI within your curriculum, using AI within your academic staff units, or using AI from the student's perspective.

And the final thing I'll say is try to get some external validation, right? At the very beginning, a year and a half ago, we had no idea what we were doing. I don't think that's a secret for anybody, because we all still don't really know what we're doing. We're all just rolling with it and moving fast with trial and error. But the more feedback you can get from industry, from the experts who are using AI every single day within their companies and have seen the transformation in their own companies, the more those external partners can tell you whether what you're doing inside the academic area is working or not.

Thanks so much for that great presentation, Angela. What I really liked is how you talked about how in the classroom, you also share where you used AI, where it worked, where it didn't work to kind of build that culture of transparency with the students. Because it kind of reinforces the message that all of us are using AI in some form or the other, and it's okay to do that. So really appreciate that.

So a couple of questions from our audience. The first one is about AI disclosure forms. Obviously, AI disclosure forms give faculty insights into how students are using AI, so that we can ask better questions or engage more deeply with AI. But the question is: at a time when faculty are so busy with grading, how do you think about reviewing AI disclosure forms, especially if you do that with every assignment?

Yeah, so the first thing that I would suggest is prioritizing your big assignments, right? If you have a lot of different deliverables, a lot of different assignments over the course of the semester, I would say prioritize the two or three big ones that would have the highest impact, and/or the ones where you actually want students to be using the AI. When we first started this AI disclosure form, we really took it from the perspective of, we just wanna see how students are even thinking about using it. We've since shifted, and there are particular assignments where we actually want them to use AI, for a particular reason. So I would say, think about what's the highest impact and highest value, in terms of you as the professor, or of the students needing to learn those AI skills, that you might wanna review.

And the second thing is, yeah, I mean, I don't have a great answer for making it not extra work, right? It definitely is a little extra work. What I actually do is read through all of them. We come from a small private institution, so our class size is about 35. So yeah, it's a little extra work, but it's not the thousand-person classroom that some of the larger public institutions might be facing when we think about scale.

In theory, you could, if your institution's okay with this, upload all those AI disclosure forms and then have AI actually summarize the key points, right? You could then say, hey, 80% of your students used it for brainstorming, but no one used it for data visualization. And so at that point, you can get the big class generalizations with maybe five minutes of extra work. But again, I would refer you to your institution to see if that's okay, and within your own AI policies.
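For the structured (checkbox) portion of such forms, the class-level generalizations Angela describes can also be computed locally, without uploading anything anywhere. A minimal sketch, with hypothetical field names that you would adapt to your own form's categories:

```python
# Tally what share of students reported each AI use category on
# their disclosure forms. The "uses" field names are hypothetical.
from collections import Counter

def summarize_disclosures(forms: list[dict]) -> dict[str, float]:
    """Return the percentage of students reporting each use category."""
    counts = Counter()
    for form in forms:
        for use in form.get("uses", []):
            counts[use] += 1
    n = len(forms) or 1  # avoid dividing by zero on an empty batch
    return {use: round(c / n * 100, 1) for use, c in counts.items()}

forms = [
    {"student": "a", "uses": ["brainstorming", "editing"]},
    {"student": "b", "uses": ["brainstorming"]},
    {"student": "c", "uses": []},  # reported no AI use
]
print(summarize_disclosures(forms))
# brainstorming appears for 2 of 3 students -> 66.7
```

The open-ended reflections are the part where an AI-generated summary adds value; the counts above only cover what a spreadsheet could.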

That's interesting. Jeffrey Bussgang, a professor we work with at Harvard Business School, says that he also uploads them to AI and assesses what students are saying. He says he gets a lot of insight into what students are thinking when they come into the classroom because of that. So it's a great way to surface real information, if your university allows it.

You talked about preparing students to be AI literate and career ready. What does AI literacy look like for you in 2025?

That's a great question. For right now, we want every single one of our students to leave AU knowing what AI is and how it works to a certain degree. We're not talking about building your own model, but at least understanding the data inputs, what a fair model looks like, why there might be biases in a model, and what a hallucination is, so that they can talk about those things, or be aware of them from a responsible standpoint, in the business world. And when we think about the career-readiness portion, that's changing every single day, to be very honest with you. Every single week we talk to a new employer and learn something new about how these companies are integrating AI more deeply.

But for right now, I would say it's prompt engineering, coupled with understanding enough about your individual domain, the typical learning outcomes in marketing, finance, accounting, whatever your niche is. When you have enough expertise there and couple it with prompt engineering, you can 10x, 15x, or 20x your individual output and workflow.

Great. The next question is from Valerie, who's a professor at Empire State University. She asked, do you teach online? And if so, how did you adjust your course design?

So I unfortunately do not teach online, or maybe fortunately, depending on how you think about it. All of my classes are residential and in-person, so I don't have an answer for the online piece.

How do you think about academic integrity in the age of AI, and how do you think about over-reliance on AI, especially when you think about student progression?

Yeah, so personally, I'm more concerned about the second one than the first. My honest opinion on academic integrity is that students have always found ways to cheat, and they'll still find ways to cheat; this is just another tool or method they could use. But at the end of the day, if they aren't actually going through the learning process, if they're just using AI to do all of their assignments and all of the work for them instead of using it as a copilot, they're only cheating themselves. So I'm not going to say much more about academic integrity. When it comes to over-reliance, though, I think this is where, within our classrooms, we really need to start thinking about leveling our AI usage.

So when we think about our freshmen, we want them to understand what AI is, how it works, and some of the basics. But at the end of the day, they're still freshmen, and they don't have the industry or domain knowledge to really get the most out of these systems. For instance, I was teaching my class of freshmen. They're 18 years old, they come in thinking that every single thing that comes out of ChatGPT is 100% right all the time, and they were shocked when I gave them an assignment where I asked, what are you an expert in? Because if I tell you to use AI to look up some fact about information technology, none of you will have the base knowledge to say, yes, that's right, or no, it's not.

So I gave them 10 minutes and said, hey, what are you an expert in? Do you play tennis? Are you a ballerina? Whatever it is, spend five minutes with ChatGPT and have a conversation about it. When we finished, my students finally understood what I meant by hallucinations. They said, oh, I need to fact-check this; I actually can't just assume everything is right 100% of the time. Little activities like that, where you showcase not only the highlights of AI but also where it can go wrong, are a way to build awareness and decrease that over-reliance on the technology.

A lot of us millennials went through this in the Google era as well, and were taught to verify results from Google or Wikipedia. That deeply resonated with me; I was the Wikipedia kid. All of my professors would say, you can't use Wikipedia as a source, but you can go to the sources it cites and cross-reference some of them.

The next question comes from Yufan, who is an assistant professor of marketing at Cal Poly. Have you considered using AI for grading different types of assignments? What has worked, what hasn't, and do you have any best practices?

So we have a little AI committee, and over this past spring we tasked it with thinking about the grading component. As of right now, our policy is that we don't want to use it for grading, just as we probably don't want to use it for merit reviews or reappointment paperwork. But the middle ground we've come up with is that we could use it for feedback, especially when students want feedback throughout their learning process. So that's something we're still noodling on from the grading side, because if you talk to any professor, grading is always long and tedious, and it's something we want to minimize. It's still a work in progress for us.

But you are using it for lesson planning and creating assignments for the classroom?

Yep. And like I said, I always disclose it to the students. I'm super upfront: hey, I used AI for this, or I used AI for that. Having that transparency has been helpful, both from the students' perspective and from ours, in building AI trust.

Giorgio, who's a professor at UCSF, says they're super interested in your in-class use of AI to demonstrate proper prompting to your students. Can you talk a little more about how you teach proper prompting, and whether there are any good resources you'd recommend?

One really good resource actually comes from OpenAI: they recently released a 30-page prompting guide. I don't know, Siya, if you have the link or if someone from your team can grab it. It really walks you through what prompting is: all of the steps, all of the things to consider, the formatting, and different ways to be super explicit. On a fun note, I always task my students with a basic exercise: give me instructions for how to make a peanut butter and jelly sandwich. They'll say, oh, you just put peanut butter on bread. And I'm like, well, where's the bread? I don't have bread yet. It's an interactive way to get students thinking about how explicit, direct, or creative they need to be with the instructions they give, because we can't always interpret exactly what they mean. I think it's a fun, low-stakes exercise to get students thinking about prompting in a slightly less technical sense.

Great. And we're dropping the prompting guide in chat so everyone can access that. Faculty change is often the hardest part of any transformation. What worked in encouraging your colleagues to adopt AI into their courses, especially colleagues who may not have been eager to do so?

Yeah. Faculty buy-in is still a challenge, and it's always going to be a challenge, but what worked really well for us is that at the very beginning of our AI journey, we identified a handful of key individuals who, no matter what we came up with, would be using AI anyway, because they were super innovative, really interested in AI, or already doing research in the area. We relied on those individual contributors, like in the framework I showed at the very beginning, where you have those few individuals off doing their own thing. Maybe they have a little tutor GPT for their class. We relied on them to start that community building and do some demonstrations.

So one of the first technical trainings we did actually just highlighted, in a very public way, how our individual faculty across marketing, information technology, finance, and accounting were already using AI. The goal was to encourage their departments: even though I'm an accountant, look at so-and-so doing this, or so-and-so doing that. It helps break down barriers and encourages cross-pollination, because I know in academia we typically like to stay in our silos.

And the second thing I'll say, which may or may not be popular, is that you really need to think about how to reach the middle group of your faculty: people who are interested but might not have the time, or who are interested in AI but are really focused on their research. Focus on that interested middle portion, whom you can get bought in if you find the niches that pique their curiosity about the technology, which can then expand into other areas. There's always going to be a bottom 15 to 20 percent who are either close to retirement or just not interested, and, this is going to sound bad, but you kind of have to ignore them a little bit. You can still listen to them, take their concerns and questions, and help them out, but there's an innate drive that faculty either have or don't. So start with the individuals who are going to do it anyway, grow the middle portion of professors who are interested but might not know how to get started, and accept that those who aren't interested might never be. Focus on where you'll get the most return.

This is something you brought up earlier in your presentation, too, but I'd love to hear more: how do you think about the cost of waiting too long for AI adoption on a campus?

Yeah, we took the approach of, we have to, because industry was telling us we have to. We couldn't wait, because we knew that if we did, none of our students would be prepared when they went looking for jobs; there's going to be a shift in expectations around student preparedness in the job market. So we took the approach of, let's see what we can do in terms of really small changes, run with them, and then grow over time at a much higher scale.

So we're up to 58 classes now that have AI infused into them. When we started last summer, going into this past fall, our goal was to identify 20. Most of those came from our IT or upper-level classes that already covered traditional machine learning or paralleled AI really nicely. As we ran those small pilots, got more buy-in, got more comfortable with the technology, and saw the different applications, we were able to expand really quickly. For us, it was always: run small pilots, go as fast as we can, try things out, and see where we can go from there.

And as you've changed the classroom and the curriculum process, have you noticed any changes in the quality of student work?

Yeah. I am always amazed and impressed by the quality of their presentations now. My students use AI-assisted presentation generators all the time, and the decks look way better than the white template with three slides, or paragraphs of text, that I used to get all the time. So on the final deliverable, I'm blown away almost every semester by some new high-quality thing. And like I talked about in the presentation, we're now starting to shift our expectations: we can go deeper. Making a presentation doesn't take five hours anymore; it takes 30 minutes if you use the right tools or know the right prompting techniques. So we can go a little deeper into some topics, or set higher expectations for final deliverables, in ways that would be nearly impossible at particular levels without AI.

How do you think this is going to impact the job market when your students graduate and enter the workforce?

Yeah. We're hearing positive feedback from all of our external partners and collaborators in terms of career outcomes. The financial industry is getting disrupted, and consulting is changing as well, and those are the two big career outcomes for our business school students. So as I said, students need the expertise in their domain, that vertical expertise. But the more they understand AI on top of that, the more they're going to crush getting jobs and building careers.

Amazing. We'll take two more questions from the audience. One is from a professor at Santa Clara University, who is thinking a lot about how to change assignments to make them AI-ready. Any suggestions on how to design assignments that cannot be easily completed by AI?

Yeah. It's really going to depend on your individual domain and expertise. Something I instituted this past spring is a shift in how I think about homework in my Python and R classes. Before, in the pre-AI world, I would give a pretty chunky homework assignment where students had to fight through some code and syntax, and then answer interpretation questions. Because from a business perspective, I don't really care whether you can be a technical coder; I want you to understand the outputs, what they mean, what business value they have, and how you'd operationalize those data-driven insights. So I had all these interpretation questions.

What I've shifted to this semester is: I still give them a small homework assignment, but now it's just the code. I say, I want you to get the code working. Then, when you come to class a week or two later, you take a small quiz where I ask about the coding process, why we use this method over that one, along with interpretation questions like: you got this output from the code, what does it mean? At first it was unpopular with my students. But when we wrapped up the semester a few weeks ago, half of my class came up to me and said, thank you for those quizzes, because they held me accountable for understanding what I was doing, not just running the code, going through the interpretations, and learning nothing. That goes back to the earlier question about ensuring students are still learning and not over-relying on the tools and the technology. These are really big questions for faculty to answer right now.

The last question, Angela, is about what are some of the custom GPTs that you've seen around campus that you think are really great? People would just love to hear some more ideas.

Yeah, so we're working with our grad admissions department, because as you can imagine, prospective students always have questions, and they ask pretty much the same ones over and over, just from different students. So we're working with them on a custom GPT solution that can tackle a lot of the easy, repetitive questions.

I'm also partnering with our career services department to create GPTs not only for preparing students, resume preparation and interview prep, but also, from the staff perspective, to automate some of that networking: warm leads, understanding who's showing up at an event. I think I talked to you about this, Siya. Whenever I prepare for a conference now, I'll take the list of who's presenting or attending, along with the conference name, and put it into a GPT. It'll go through everyone's LinkedIn and tell me who they are, where they've worked, their biggest pain points, and any big news about them, to help me prepare.

We also have a writing tutor coming for business-style writing. We have more ideas than we can probably talk about right now; everything is getting a little GPT. And on the student side, our student clubs have all made their own little GPTs to help with their content and information.

And what advice do you have for someone who's trying to come up with ideas for where to deploy GPTs on their campus?

That's a great question. I would say, figure out what the big pain point is on your campus. The more buy-in you can get, the more people who say, yeah, I absolutely despise doing that, or it's the biggest pain in the butt, the more individuals you'll have to help you build and test. Again, it's a whole social experience. So ask: what pain points currently exist, and what's the scale? Is this going to reach five students or five faculty, or is it going to reach a lot of people? And the last thing I'll suggest, if you're getting started, is to think internally first, because that's a lot less risky than external communications or external-facing tools. Those would be my suggestions for starting. I don't have the context of who or where you are or what your problems look like, but there are frameworks out there that can help with identifying AI workflows and choosing the best first use case.

Amazing. Thank you, Angela. In chat, we're also going to drop a resource, which is our EDU Hub, where anyone can learn how to prompt better and also think about custom GPTs, how to build them, and where they can go on the college campus. So that's going to be available in chat now. Thank you so much, Angela, for taking the time today and answering all of these wonderful questions from the audience.

Thanks for having me again, Siya. And for anyone who's interested in working with us, we're dropping the link for ChatGPT education as well. Feel free to reach out to us. We're really excited to partner with your university and help you become AI native. Thanks so much, everyone. Have a good night.
