The original commenter @[email protected] specified neither their own gender nor that of the people they tutored.
Multiple people under that comment simply assumed that OP was male and tutoring girls. That is heteronormative. Yes, I phrased that with a bit of snark. But come on.
IMO these issues are mainly with the interface, i.e. how the AI summaries are presented.
The issue with incorrect answers like the glue-on-pizza one isn't "hallucination". The LLM is pulling that info from an existing webpage (in that case, a joke comment on Reddit). The thing they need to change is how that info is portrayed. Not "one tip is to use glue", but rather "a joke comment on Reddit says to use glue".
Hallucination should be combated by the fact that the AI can't show a proper source for facts it made up itself.