What is GenAI?
The LRC's Curriculum Support Assistant Laura Paciorek dives into generative artificial intelligence and AI literacy.
Jan 31, 2025
Are you using generative artificial intelligence (GenAI) in your personal life, school, or work? Is the kind of GenAI you are using narrow AI, general AI, or artificial superintelligence? If you aren’t sure, or feel uncertain about GenAI, you are not alone! While many people think of AI as human-like (Hornberger et al., 2023), or general AI, applications like ChatGPT, Gemini, Copilot, and Claude are narrow AI: “it is very goal-orientated and uses machine learning techniques to achieve one specific goal or task” (Marrone et al., 2022, p. 1). These tools are language models: they take text prompts and generate human-like responses, drawing on immense amounts of training data to shape the output. Some tools can also pull in current internet data when crafting a response (e.g., Microsoft Copilot), while others rely only on the data they were trained on, which stops at a cutoff date (e.g., ChatGPT in its default mode). Large language models can be quite useful, but they can also raise issues around bias, outdated or fabricated information, and privacy. Luckily, the Learning Resources Center is here to help students learn about GenAI and use it effectively for tasks like studying, time management, and planning.
To enhance AI literacy, it's helpful to explore some of the concerns about GenAI. Doing so begins to reveal which parts of the GenAI process call for a critical human eye.
Bias
Realistically, the data GenAI uses doesn't reflect all voices and experiences (UNESCO, 2023). With tools that draw on current internet data, consider who gets their ideas published, and then consider who does not. The missing voices leave gaps in the data, and bias can prevail. Bias is a concern in any research, but it is easier to identify and address when the underlying sources are cited transparently, and not all AI tools provide that level of transparency. Just as students need to think critically about the usefulness of an article, they must consider whether what GenAI produces is biased.
Falsification
Similarly, during data analysis and response generation, GenAI might use incorrect data, misinterpret data, or simply make things up when there is no appropriate data to shape the output. GenAI doesn't evaluate the quality of its data or sources, and at times it can confabulate, fabricate, or falsify information (Emsley, 2023). GenAI is not thinking; it is generating human-like text in response to a prompt. Users must watch for false output.
Privacy
User privacy is another area to consider. Each tool has a privacy policy and terms and conditions. Reading these takes time, but it helps users decide what feels comfortable and appropriate when using GenAI. In some cases, entering sensitive information could create confidentiality issues: information a user inputs into GenAI can become part of the tool’s training data and show up in output for others. Before entering anything into GenAI, consider whether it would be okay to post that information directly on the internet.
AI Literacy
AI is complicated, and there is a real need for AI literacy. This article is a basic overview and only scratches the surface of what GenAI is doing, but it is a start toward understanding GenAI and some of the concerns that come with using these tools.
Sometimes fears about GenAI prevent students from using it. In other cases, students use GenAI without really knowing what it's doing. Both can be problematic in different ways. Marrone et al. (2022) found that students who are familiar with AI are more comfortable using it. The Learning Resources Center (LRC) understands that navigating GenAI doesn't feel comfortable for everyone, and even power users may have things to learn. The LRC aims to provide a comfortable environment for exploring GenAI as a learning experience.
The LRC will hold a spring semester workshop about GenAI and how to use it to promote student success. The workshop will explore a variety of well-known and lesser-known tools, feeding sample prompts into each to see how the output differs. For example, asking Copilot, Gemini, and ChatGPT to create a one-month plan for completing a large project can yield different results depending on the tool. The LRC has done some of this exploring for students so that they can find the tools and strategies that work well for them as individuals. The LRC invites users of all levels to attend the workshop, share their own AI use cases and experiences, and help create a community of learning around GenAI.
Study Buddy: Using AI as a Study Tool — March 11, 11:30 a.m. to 12:15 p.m.
The LRC also provides one-on-one coaching sessions for students who would like to explore GenAI in a safe environment and learn how to use it ethically for tasks such as studying, time management, and planning. Students who need or prefer an asynchronous option can check out the LRC Canvas page for modules about AI and enroll today! If you have questions about how the LRC can support your academic success, please reach out: email us or visit our website for more info.
References:
Emsley, R. (2023). ChatGPT: These are not hallucinations – they’re fabrications and falsifications. Schizophrenia, 9, Article 52. https://doi.org/10.1038/s41537-023-00379-4
Hornberger, M., Bewersdorff, A., & Nerdel, C. (2023). What do university students know about Artificial Intelligence? Development and validation of an AI literacy test. Computers and Education: Artificial Intelligence, 5, Article 100165. https://doi.org/10.1016/j.caeai.2023.100165
Marrone, R., Taddeo, V., & Hill, G. (2022). Creativity and Artificial Intelligence—A student perspective. Journal of Intelligence, 10(3), Article 65. https://doi.org/10.3390/jintelligence10030065
United Nations Educational, Scientific and Cultural Organization. (2023). Guidance for generative AI in education and research. https://unesdoc.unesco.org/ark:/48223/pf0000386693