Toward a Human-Centered Digital Ecosystem: NJIT
How do we ensure that we're teaching the ethics and philosophical issues that prepare our learners for the digital transformation we expect to see?
This is a broader question that concerns not just traditional undergraduates; it's about anyone seeking further education and fulfillment at any point in their lifetime. How are we helping them to understand not just the technology at hand, but how to use it more wisely? How can our institution lead the way in this context and help our learners and colleagues not only leverage technology for good, but also catch and intervene when the technology's not being used for good? How can we model and prioritize core values and norms that will help society counteract misinformation and unethical practices in social media and on the Internet? We can start by finding a way to reinvigorate the core curriculum with ethics for a human-centered digital ecosystem.
Grush: Those are significant goals for the 2030 plan! Could you give me a simple example? Have you been able to integrate some of that into the curriculum already?
Wozencroft: As a brief example, when we offer a class on generative AI, our faculty talk through its pitfalls along with its power and potential, and how to identify where things may be going well, or going wrong.
To achieve this, a lot of my own focus has been on how I partner with our academic teams, primarily our deans, our department chairs, and our associate deans, to say that we have all this technology at our fingertips and we want to help you integrate it into the classroom in an effective way. We want to focus on giving students the tools they need, the tools they're going to use out in the real world, along with an understanding of the implications of using them.
To be more specific, one of the things I think we can use a generative AI tool for today, very effectively, is to help our students figure out how to ask the right question by experimenting with different strategies. You may even challenge them and incentivize them to construct the wrong input, to demonstrate an understanding of why it's wrong. Or give them an answer ahead of time and say, "Now you need to go ask ChatGPT, or Copilot, or you name the AI, to help you build the question to achieve a similar result."
Because so much of what we do in learning, research, or other work involves asking the right question, we need to do a better job of training for that. And that could apply to anything: Help me solve this math problem. Help me navigate this really tricky HR-related situation… I think we're well on our way to figuring out such things.
Grush: How do you gear up, if you will, your strategies toward the goal of building a really great digital ecosystem? And how do you keep that human-centered element on track and get the trust and buy-in you need?
Wozencroft: The majority of my job is outward-facing. I can talk tech all day long, but my preference is to understand the needs of our community clearly and speak their language. And that's a big part of how we crafted our 2030 strategy and how the digital vision has grown out of it.
We had to understand what's happening outside of NJIT, as well as what's happening on our own campus today. We consider, especially, the changing demographics of the learner. An important part of it for me was becoming very conversant in the overall business of education, especially higher education, of course.