With resources like Google at our fingertips, information isn't hard to find. What is tougher is finding reliable information.
Google can actually make this harder, for two main reasons.
First, Google personalizes your results based on factors like your location and search history. This kind of personalization can be helpful when you are looking for local weather, sports scores, or new music suggestions, but it can also narrow the kinds of information that show up when we Google.
Second, Google has integrated its AI program Gemini into almost all searches. Most other search engines also have AI "assistants," and AI-generated results are now often the first thing you see in a search.
Gemini, ChatGPT, and other AI assistants are multimodal Large Language Models (LLMs). LLMs scrape content from massive datasets (in this case, websites) and reorganize that content to mimic human speech.
AI assistants do NOT function like a search engine; rather, they patch together the information they find into human-sounding content. This often results in incorrect information: one study showed that LLMs get facts wrong more than 60% of the time.
This means we need to be even more skeptical and vigilant about AI-generated content in search results, and verify everything an LLM supplies against actual sources. Sometimes the AI assistant will provide links to websites along with its content summary.
It is ALWAYS best practice to follow any AI-generated information back to the original sources and verify it there. Check out our Student Guide to AI for more information: