(NewsNation) — Don’t believe everything you learn from AI.
An AI chatbot joined a mushroom foraging group and recommended ways to cook a deadly mushroom, 404 Media reports.
“FungiFriend” was added to a popular mushroom discussion Facebook group and encouraged members to “sauté in butter” a potentially dangerous mushroom. The chatbot also said it could be used in “soups or stews” or preserved by pickling.
A member of the group asked “FungiFriend” how to cook Sarcosphaera coronaria, a mushroom that hyperaccumulates arsenic and has caused at least one documented death, 404 Media reports.
404 Media previously reported on AI-generated foraging books on Amazon that incorrectly suggest poisonous mushrooms are safe to eat.
The Facebook group, which includes some 13,000 members, is a place where mushroom foragers can ask each other to identify mushrooms they find in the wild. A moderator for the group said the chatbot was automatically added by Meta and that “we are most certainly removing it,” 404 Media reports.